As marine mammals evolved to make water their primary habitat, they lost the ability to make a protein that defends humans and other land-dwelling mammals from the neurotoxic effects of a popular human-made pesticide.
The implications of this discovery, announced today in Science, led researchers to call for monitoring our waterways to learn more about the impact of pesticides and agricultural run-off on marine mammals, such as dolphins, manatees, seals and whales. The research also may shed further light on the function of the gene encoding this protein in humans.
“We need to determine if marine mammals are, indeed, at an elevated risk of serious neurological damage from these pesticides because they biologically lack the ability to break them down, or if they’ve somehow adapted to avoid such damage in an as-yet undiscovered way,” said Clark, an associate professor in Pitt’s Department of Computational and Systems Biology and the Pittsburgh Center for Evolutionary Biology and Medicine. “Either way, this is the kind of serendipitous finding that results from curiosity-driven scientific research. It is helping us to understand what our genes are doing and the impact the environment can have on them.”
Clark and Meyer, a postdoctoral associate in his laboratory, knew from previous research by other scientists that some genes behind smelling and tasting lost their function during the evolution of marine mammals. They set out to see what other genes conserved in land-dwelling mammals had lost function in marine mammals.
By analyzing DNA sequences from five species of marine mammals and 53 species of terrestrial mammals, the researchers found that PON1 was the gene that best matched the pattern of losing function in marine mammals while retaining function in all terrestrial mammals. PON1 even beat out several genes responsible for smell and taste, senses that marine mammals don’t rely on much.
In humans and other terrestrial mammals, PON1 reduces cellular damage caused by unstable oxygen atoms. It also protects us from organophosphates, some of which are pesticides that kill insects — which lack PON1 — by disrupting their neurological systems.
Clark and Meyer worked with Joseph Gaspard, Ph.D., director of science and conservation at the Pittsburgh Zoo & PPG Aquarium, and with a scientist emeritus at the U.S. Geological Survey’s Wetland and Aquatic Research Center, to obtain marine mammal blood samples from U.S. and international scientists and conservation biologists. Collaborators at the University of Washington reacted blood samples from several marine mammals with an organophosphate byproduct and observed what happened. The blood did not break down the organophosphate byproduct the way it does in land mammals, indicating that, unless a different biological mechanism is protecting the marine mammals, they would be susceptible to “organophosphate poisoning,” which results from a buildup of chemical signals in the body, especially the brain.
In an attempt to learn why marine mammals lost PON1 function, the researchers traced back when the function was lost in three different groups of marine mammals. Whales and dolphins lost it soon after they split from their common ancestor with hippopotamuses 53 million years ago; manatees lost it after their split from their common ancestor with elephants 64 million years ago. But some seals likely lost PON1 function more recently, at most 21 million years ago and possibly in very recent times.
“The big question is, why did they lose function at PON1 in the first place?” said Meyer. “It’s hard to tell whether it was no longer necessary or whether it was preventing them from adapting to a marine environment. We know that ancient marine environments didn’t have organophosphate pesticides, so we think the loss might instead be related to PON1’s role in responding to the extreme oxidative stress generated by long periods of diving and rapid resurfacing. If we can figure out why these species don’t have functional PON1, we might learn more about the function of PON1 in human health, while also uncovering potential clues to help protect marine mammals most at risk.”
As an example of the potential real-world consequences of losing function at PON1, the researchers explain in their scientific manuscript that in Florida, “agricultural use of organophosphate pesticides is common and runoff can drain into manatee habitats. In Brevard County, where 70 percent of Atlantic Coast manatees are estimated to migrate or seasonally reside, agricultural lands frequently abut manatee protection zones and waterways.”
The scientists believe the next step is to launch a study that directly observes marine mammals during and shortly after periods of excess agricultural organophosphate run-off. Such a project would require increased monitoring of marine mammal habitats, as well as testing of tissues from deceased marine mammals for evidence of organophosphate exposure. The most recent estimate the research team could find of organophosphate levels in manatee habitats in Florida is a decade old, Clark said.
“Marine mammals, such as manatees or bottlenose dolphins, are sentinel species — the canary in the coal mine,” said Clark. “If you follow their health, it will tell you a lot about potential environmental issues that could eventually affect humans.”
The first full characterization of an accelerator beam in six dimensions will advance the understanding and performance of current and planned accelerators around the world.
A team of researchers led by the University of Tennessee, Knoxville conducted the measurement in a beam test facility at the Department of Energy’s Oak Ridge National Laboratory using a replica of the Spallation Neutron Source’s linear accelerator.
“Our goal is to better understand the physics of the beam so that we can improve how accelerators operate,” said Cousineau, group leader in ORNL’s Research Accelerator Division and UT joint faculty professor. “Part of that is related to being able to fully characterize or measure a beam in 6D space — and that’s something that, until now, has never been done.”
Six-dimensional space is like 3D space but with three additional coordinates: each particle is described by its position on the x, y and z axes plus its motion, or velocity, along each of those axes.
“Right away we saw the beam has this complex structure in 6D space that you can’t see below 5D — layers and layers of complexities that can’t be detangled,” Cousineau said. “The measurement also revealed the beam structure is directly related to the beam’s intensity, which gets more complex as the intensity increases.”
Previous attempts to fully characterize an accelerator beam fell victim to “the curse of dimensionality,” in which measurements in low dimensions become exponentially more difficult in higher dimensions. Scientists have tried to circumvent the issue by adding three 2D measurements together to create a quasi-6D representation. The UT-ORNL team notes that approach is incomplete as a measurement of the beam’s initial conditions entering the accelerator, which determine beam behavior farther down the linac.
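The scale of that curse is easy to see with a quick sketch: the number of grid cells needed to sample a beam at a fixed per-axis resolution grows exponentially with the number of dimensions. The 30-points-per-axis figure below is an arbitrary illustrative value, not a parameter from the experiment.

```python
# Illustrative sketch of the "curse of dimensionality": cells needed to
# sample phase space at a fixed per-axis resolution grow exponentially
# with dimension. The 30-points-per-axis resolution is arbitrary.

def grid_cells(points_per_axis: int, dimensions: int) -> int:
    """Cells in a uniform grid spanning `dimensions` axes."""
    return points_per_axis ** dimensions

for d in (2, 4, 6):
    print(f"{d}D at 30 points/axis: {grid_cells(30, d):,} cells")
```

Going from a 2D slice to the full 6D measurement multiplies the sampling burden by almost a million at this resolution, which is why stitching together 2D projections was the usual workaround.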
As part of efforts to boost the power output of SNS, ORNL physicists used the beam test facility to commission the new radio frequency quadrupole, the first accelerating element located at the linac’s front-end assembly. With the infrastructure already in place, a research grant from the National Science Foundation to the University of Tennessee enabled outfitting the beam test facility with the state-of-the-art 6D measurement capability. Conducting 6D measurements in an accelerator has been limited by the need for multiple days of beam time, which can be a challenge for production accelerators.
“Because we have a replica of the linac’s front-end assembly at the beam test facility, we don’t have to worry about interrupting users’ experiment cycles at SNS. That provides us with unfettered access to perform these time-consuming measurements, which is something we wouldn’t have at other facilities,” said lead author Brandon Cathey, a UT graduate student.
“This result shows the value of combining the freedom and ingenuity of NSF-funded academic research with facilities available through the broad national laboratory complex,” said Vyacheslav Lukin, the NSF program officer who oversees the grant to the University of Tennessee. “There is no better way to introduce a new scientist — a graduate student — to the modern scientific enterprise than by allowing them to lead a first-of-a-kind research project at a facility that uniquely can dissect the particles that underpin what we know and understand about matter and energy.”
The researchers’ ultimate goal is to model the entire beam, including mitigating so-called beam halo, or beam loss — when particles travel to the outer extremes of the beam and are lost. The more immediate challenge, they say, will be finding software tools capable of analyzing the roughly 5 million data points the 6D measurement generated during the 35-hour period.
“When we proposed making a 6D measurement 15 years ago, the problems associated with the curse of dimensionality seemed insurmountable,” said ORNL physicist and coauthor Alexander Aleksandrov. “Now that we’ve succeeded, we’re sure we can improve the system to make faster, higher resolution measurements, adding an almost ubiquitous technique to the arsenal of accelerator physicists everywhere.”
New research shows that sodium consumption does not increase health risks for the vast majority of individuals; only those who eat more than five grams a day, the equivalent of 2.5 teaspoons of salt, face an elevated risk.
Fewer than five per cent of individuals in developed countries exceed that level.
The large, international study also shows that even for those individuals there is good news. Any health risk of sodium intake is virtually eliminated if people improve their diet quality by adding fruits, vegetables, dairy foods, potatoes, and other potassium-rich foods.
The study followed 94,000 people, aged 35 to 70, for an average of eight years in communities from 18 countries around the world and found an associated risk of cardiovascular disease and stroke only where the average intake was greater than five grams of sodium a day.
China is the only country in their study where 80 per cent of communities have a sodium intake of more than five grams a day. In the other countries, the majority of the communities had an average sodium consumption of 3 to 5 grams a day (equivalent to 1.5 to 2.5 teaspoons of salt).
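The gram-to-teaspoon conversions quoted in this article can be sanity-checked with two textbook assumptions: sodium makes up roughly 39 percent of table salt by mass, and a teaspoon of salt weighs roughly 5 grams. A minimal sketch under those assumptions:

```python
# Rough check of the sodium-to-salt conversions quoted in the study.
# Assumptions: sodium is ~39% of table salt (NaCl) by mass, and one
# teaspoon of salt weighs about 5 grams. Both are approximations.

SODIUM_FRACTION = 0.393   # mass fraction of sodium in NaCl
GRAMS_SALT_PER_TSP = 5.0  # approximate mass of a teaspoon of salt

def sodium_grams_to_tsp_salt(sodium_g: float) -> float:
    """Teaspoons of salt containing `sodium_g` grams of sodium."""
    salt_g = sodium_g / SODIUM_FRACTION
    return salt_g / GRAMS_SALT_PER_TSP

for sodium in (2.0, 3.0, 5.0):
    print(f"{sodium} g sodium ≈ {sodium_grams_to_tsp_salt(sodium):.1f} tsp salt")
```

Five grams of sodium works out to about 2.5 teaspoons of salt and two grams to about one teaspoon, matching the figures quoted above.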
“The World Health Organization recommends consumption of less than two grams of sodium — that’s one teaspoon of salt — a day as a preventative measure against cardiovascular disease, but there is little evidence that health outcomes improve at such a low level, which few individuals ever achieve,” said Andrew Mente, first author of the study and a PHRI researcher.
He added that the American Heart Association recommends even less — 1.5 grams of sodium a day for individuals at risk of heart disease.
“Only in the communities with the most sodium intake, those over five grams a day of sodium, which is mainly in China, did we find a direct link between sodium intake and major cardiovascular events like heart attack and stroke.
“In communities that consumed less than five grams of sodium a day, the opposite was the case. Sodium consumption was inversely associated with myocardial infarction, or heart attacks, and with total mortality, and there was no increase in stroke.”
Mente added: “We found all major cardiovascular problems, including death, decreased in communities and countries where there is an increased consumption of potassium, which is found in foods such as fruits, vegetables, dairy foods, potatoes, nuts and beans.”
The information for the research article came from the ongoing, international Prospective Urban Rural Epidemiology (PURE) study run by the PHRI. Mente is also an associate professor of the Department of Health Research Methods, Evidence and Impact at McMaster University.
Most previous studies relating sodium intake to heart disease and stroke were based on individual-level information, said Martin O’Donnell.
“Public health strategies should be based on best evidence. Our findings demonstrate that community-level interventions to reduce sodium intake should target communities with high sodium consumption, and should be embedded within approaches to improve overall dietary quality.
“There is no convincing evidence that people with moderate or average sodium intake need to reduce their sodium intake for prevention of heart disease and stroke.”
Injecting particles into the atmosphere to cool the planet and counter the warming effects of climate change would do nothing to offset the crop damage from rising global temperatures, according to a new analysis by University of California, Berkeley, researchers.
By analyzing the past effects of Earth-cooling volcanic eruptions, and the response of crops to changes in sunlight, the team concluded that any improvements in yield from cooler temperatures would be negated by lower productivity due to reduced sunlight. The findings have important implications for our understanding of solar geoengineering, one proposed method for helping humanity manage the impacts of global warming.
“Shading the planet keeps things cooler, which helps crops grow better. But plants also need sunlight to grow, so blocking sunlight can affect growth. For agriculture, the unintended impacts of solar geoengineering are equal in magnitude to the benefits,” said lead author Jonathan Proctor, a UC Berkeley doctoral candidate in the Department of Agricultural and Resource Economics. “It’s a bit like performing an experimental surgery; the side-effects of treatment appear to be as bad as the illness.”
“Unknown unknowns make everybody nervous when it comes to global policies, as they should,” said Solomon Hsiang, co-lead author of the study and Chancellor’s Associate Professor of Public Policy at UC Berkeley. “The problem in figuring out the consequences of solar geoengineering is that we can’t do a planetary-scale experiment without actually deploying the technology. The breakthrough here was realizing that we could learn something by studying the effects of giant volcanic eruptions that geoengineering tries to copy.”
Hsiang is director of UC Berkeley’s Global Policy Laboratory, where Proctor is a doctoral fellow.
Proctor and Hsiang will publish their findings online in the journal Nature on August 8.
Some people have pointed to past episodes of global cooling caused by gases emitted during massive volcanic eruptions, such as Mt. Pinatubo in the Philippines in 1991, and argued that humans could purposely inject sulfate aerosols into the upper atmosphere to artificially cool Earth and alleviate the greenhouse warming caused by increased levels of carbon dioxide. Aerosols — in this case, minute droplets of sulfuric acid — reflect a small percentage of sunlight back into space, reducing the temperature a few degrees.
“It’s like putting an umbrella over your head when you’re hot,” Proctor said. “If you put a global sunshade up, it would slow warming.”
Pinatubo, for example, injected about 20 million tons of sulfur dioxide into the atmosphere, reducing sunlight by about 2.5 percent and lowering the average global temperature by about half a degree Celsius (nearly 1 degree Fahrenheit).
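Those numbers are roughly consistent with a zero-dimensional energy-balance estimate. The sketch below (not the study's model) computes the planet's effective radiating temperature before and after a 2.5 percent drop in sunlight, using standard values for the solar constant and Earth's albedo.

```python
# Zero-dimensional energy-balance sketch (not the study's model):
# equilibrium radiating temperature of a planet absorbing (1 - albedo)
# of incoming sunlight, before and after a 2.5% dimming.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3       # Earth's approximate planetary albedo

def equilibrium_temp(solar_constant: float) -> float:
    """Effective radiating temperature in kelvin."""
    return ((1 - ALBEDO) * solar_constant / (4 * SIGMA)) ** 0.25

t_baseline = equilibrium_temp(S0)
t_dimmed = equilibrium_temp(S0 * (1 - 0.025))  # 2.5% less sunlight
print(f"Equilibrium cooling: {t_baseline - t_dimmed:.1f} K")
```

The equilibrium estimate, about 1.6 K of cooling, overstates the observed half degree, as expected: Pinatubo's aerosols persisted only a couple of years, too short for the climate system to reach equilibrium.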
The team linked maize, soy, rice and wheat production from 105 countries from 1979-2009 to global satellite observations of these aerosols to study their effect on agriculture. Pairing these results with global climate models, the team calculated that the loss of sunlight from a sulfate-based geoengineering program would cancel its intended benefits of protecting crops from damaging extreme heat.
“It’s similar to using one credit card to pay off another credit card: at the end of the day, you end up where you started without having solved the problem,” Hsiang said.
Some earlier studies suggested that aerosols might also improve crop yields by scattering sunlight, allowing more of the sun’s energy to reach interior leaves typically shaded by upper canopy leaves. The new analysis found this scattering benefit to be weaker than previously thought.
“We are the first to use actual experimental and observational evidence to get at the total impacts that sulfate-based geoengineering might have on yields,” Proctor said. “Before I started the study, I thought the net impact of changes in sunlight would be positive, so I was quite surprised by the finding that scattering light decreases yields.”
Despite the study’s conclusions, Proctor said, “I don’t think we should necessarily write off solar geoengineering. For agriculture, it might not work that well, but there are other sectors of the economy that could potentially benefit substantially.”
Proctor and Hsiang noted that their methods could be used to investigate the impact of geoengineering on other segments of the economy, human health and the functioning of natural ecosystems.
They did not address other types of geoengineering, such as capture and storage of carbon dioxide, or issues surrounding geoengineering, such as its impact on Earth’s protective ozone layer and who gets to set Earth’s thermostat.
“Society needs to be objective about geoengineering technologies and develop a clear understanding of the potential benefits, costs and risks,” Proctor said. “At present, uncertainty about these factors dwarfs what we understand.”
The authors emphasize the need for more research into the human and ecological consequences of geoengineering, both good and bad.
“The most certain way to reduce damages to crops and, in turn, people’s livelihood and well-being, is reducing carbon emissions,” Proctor said.
“Perhaps what is most important is that we have respect for the potential scale, power and risks of geoengineering technologies,” Hsiang said. “Sunlight powers everything on the planet, so we must understand the possible outcomes if we are going to try to manage it.”
A new study led by scientists at the University of Bristol has warned that unless we mitigate current levels of carbon dioxide emissions, Western Europe and New Zealand could revert to the hot tropical climate of the early Paleogene period — 56-48 million years ago.
As the ongoing heat wave shows, the knock-on effects of such extreme warmth include arid land and fires, as well as impacts on health and infrastructure.
The early Paleogene is a period of great interest to climate change scientists because carbon dioxide levels then (around 1,000 ppmv) were similar to those predicted for the end of this century.
Dr David Naafs from the University of Bristol’s School of Earth Sciences said: “We know that the early Paleogene was characterised by a greenhouse climate with elevated carbon dioxide levels.
“Most of the existing estimates of temperatures from this period are from the ocean, not the land — what this study attempts to answer is exactly how warm it got on land during this period.”
Scientists used molecular fossils of microorganisms in ancient peat (lignite) to estimate land temperatures 50 million years ago. This demonstrated that annual land temperatures in Western Europe, as well as in New Zealand, were actually higher than previously thought: between 23 and 29 °C, some 10 to 15 °C higher than current average temperatures in these areas.
These results suggest that temperatures similar to those of the current heat wave that is influencing western Europe and other regions would become the new norm by the end of this century if CO2 levels in the atmosphere continue to increase.
Professor Rich Pancost, co-author and Director of the University of Bristol Cabot Institute, added: “Our work adds to the evidence for a very hot climate under potential end-of-century carbon dioxide levels. Importantly, we also study how the Earth system responded to that warmth. For example, this and other hot time periods were associated with evidence for arid conditions and extreme rainfall events.”
The research team will now turn their attention to lower-latitude regions to see how hot land temperatures were there.
Dr Naafs said: “Did the tropics, for example, become ecological dead zones because temperatures in excess of 40 °C were too high for most forms of life to survive?
“Some climate models suggest this, but we currently lack critical data.
“Our results hint at the possibility that the tropics, like the mid-latitudes, were hotter than present, but more work is needed to quantify temperatures from these regions.”
Obscured by thick clouds of absorbing dust, the closest supermassive black hole to the Earth lies 26,000 light years away at the centre of the Milky Way. This gravity monster, which has a mass four million times that of the Sun, is surrounded by a small group of stars orbiting at high speed. This extreme environment — the strongest gravitational field in our galaxy — makes it the perfect place to test gravitational physics, particularly Einstein’s general theory of relativity.
New infrared observations from the exquisitely sensitive GRAVITY, NACO and SINFONI instruments on ESO’s Very Large Telescope (VLT) have now allowed astronomers to follow one of these stars, called S2, as it passed very close to the black hole during May 2018 at a speed in excess of 25 million kilometres per hour — three percent of the speed of light — and at a distance of less than 20 billion kilometres.
These extremely delicate measurements were made by an international team led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, in conjunction with collaborators around the world. The observations form the culmination of a 26-year series of ever more precise observations of the centre of the Milky Way using ESO instruments. ‘This is the second time that we have observed the close passage of S2 around the black hole in our galactic centre. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution’, explains Genzel. ‘We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects.’
The new measurements clearly reveal an effect called gravitational redshift. Light from the star is stretched to longer wavelengths by the very strong gravitational field of the black hole. And the stretch in wavelength of light from S2 agrees precisely with that predicted by Einstein’s theory of general relativity. This is the first time that this deviation from the predictions of simpler Newtonian gravity has been observed in the motion of a star around a supermassive black hole. The team used SINFONI to measure the motion of S2 towards and away from Earth and the GRAVITY interferometric instrument to make extraordinarily precise measurements of the position of S2 in order to define the shape of its orbit. GRAVITY creates such sharp images that it can reveal the motion of the star from night to night as it passes close to the black hole — 26,000 light years from Earth.
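The size of this effect can be estimated from the figures quoted above alone. In a simple sketch (not the team's full relativistic fit), the fractional redshift is approximately the gravitational term GM/(rc²) plus the transverse Doppler term v²/(2c²):

```python
# Order-of-magnitude check of S2's redshift at closest approach, using
# only the figures quoted in the article (4 million solar masses,
# ~20 billion km, ~25 million km/h). Illustrative, not the team's fit.

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg

m_bh = 4.0e6 * M_SUN       # black hole mass
r = 20e9 * 1e3             # closest-approach distance: 20 billion km, in m
v = 25e6 * 1e3 / 3600.0    # 25 million km/h, in m/s

z_gravitational = G * m_bh / (r * C**2)      # gravitational redshift
z_transverse_doppler = v**2 / (2 * C**2)     # special-relativistic term
z_total = z_gravitational + z_transverse_doppler

print(f"z ≈ {z_total:.1e} (apparent velocity shift ≈ {z_total * C / 1e3:.0f} km/s)")
```

The combined shift is of order 10⁻⁴, equivalent to an apparent velocity offset of roughly 170 km/s; that is the scale of deviation from Newtonian predictions the instruments had to resolve.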
‘Our first observations of S2, about two years ago, already showed that we would have the ideal black hole laboratory’, adds Frank Eisenhauer (MPE), Co-Principal Investigator of the GRAVITY instrument. ‘During the close passage, we managed not only to precisely follow the star on its orbit, we could even detect the faint glow around the black hole on most of the images.’ By combining the position and velocity measurements from SINFONI and GRAVITY, as well as previous observations using other instruments, the team could compare them to the predictions of Newtonian gravity, general relativity and other theories of gravity. As expected, the new results are inconsistent with Newtonian predictions and in excellent agreement with the predictions of general relativity. More than one hundred years after he published his paper setting out the equations of general relativity, Einstein has been proved right once more.
The hardware contribution of the Institute of Physics I of the University of Cologne was the development and construction of the two spectrometers of GRAVITY. The spectrometers analyse the wavelength of the observed stellar light and convert the received photons into electronic signals. ‘GRAVITY is a technological challenge. However, after more than two decades of astrophysical research on the high velocity stars in the Galactic Centre and on the development of astronomical instrumentation, the effort has been rewarded with an excellent result in experimental physics’, says Andreas Eckart from the University of Cologne.
Continuing observations are expected to reveal another relativistic effect later in the year: a small rotation of the star’s orbit, known as Schwarzschild precession, as S2 moves away from the black hole.
There may be more habitable planets in the universe than we previously thought, according to researchers who suggest that plate tectonics — long assumed to be a requirement for suitable conditions for life — are in fact not necessary.
When searching for habitable planets or life on other planets, scientists look for biosignatures of atmospheric carbon dioxide. On Earth, atmospheric carbon dioxide increases surface heat through the greenhouse effect. Carbon also cycles to the subsurface and back to the atmosphere through natural processes.
“Volcanism releases gases into the atmosphere, and then through weathering, carbon dioxide is pulled from the atmosphere and sequestered into surface rocks and sediment,” said Bradford Foley, assistant professor of geosciences. “Balancing those two processes keeps carbon dioxide at a certain level in the atmosphere, which is really important for whether the climate stays temperate and suitable for life.”
Most of Earth’s volcanoes are found at the border of tectonic plates, which is one reason scientists believed they were necessary for life. Subduction, in which one plate is pushed deeper into the subsurface by a colliding plate, can also aid in carbon cycling by pushing carbon into the mantle.
Planets without tectonic plates are known as stagnant lid planets. On these planets, the crust is one giant, spherical plate floating on the mantle, rather than separate pieces. These are thought to be more widespread than planets with plate tectonics. In fact, Earth is the only planet with confirmed tectonic plates.
Foley and Andrew Smye, assistant professor of geosciences, created a computer model of the lifecycle of a planet. They looked at how much heat its climate could retain based on its initial heat budget, or the amount of heat and heat-producing elements present when a planet forms. Some elements produce heat when they decay. On Earth, the main contributors are radioactive isotopes of uranium, thorium and potassium, which release heat as they decay.
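How quickly such a heat budget fades can be sketched with simple exponential decay. The half-lives below are textbook values for the main heat-producing isotopes; the equal initial abundances are arbitrary illustrative units, not figures from the authors' model.

```python
# Sketch of how radiogenic heating declines as a planet ages, using
# textbook half-lives for the main heat-producing isotopes. Not the
# authors' model; initial heat values are arbitrary illustrative units.

HALF_LIVES_GYR = {"U-238": 4.47, "U-235": 0.704, "Th-232": 14.0, "K-40": 1.25}
INITIAL_HEAT = {"U-238": 1.0, "U-235": 1.0, "Th-232": 1.0, "K-40": 1.0}

def heat_at(time_gyr: float) -> float:
    """Total relative heat production after `time_gyr` billion years."""
    return sum(
        q0 * 0.5 ** (time_gyr / HALF_LIVES_GYR[iso])
        for iso, q0 in INITIAL_HEAT.items()
    )

for t in (0.0, 2.0, 4.5):
    print(f"t = {t} Gyr: relative heat = {heat_at(t):.2f}")
```

Even in this toy version, roughly two-thirds of the starting heat production is gone after 4.5 billion years, with the short-lived isotopes (potassium-40 and uranium-235) fading fastest.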
After running hundreds of simulations to vary a planet’s size and chemical composition, the researchers found that stagnant lid planets can sustain conditions for liquid water for billions of years. At the highest extreme, they could sustain life for up to 4 billion years, roughly Earth’s life span to date.
“You still have volcanism on stagnant lid planets, but it’s much shorter lived than on planets with plate tectonics because there isn’t as much cycling,” said Smye. “Volcanoes result in a succession of lava flows, which are buried like layers of a cake over time. Rocks and sediment heat up more the deeper they are buried.”
The researchers found that at high enough heat and pressure, carbon dioxide gas can escape from rocks and make its way to the surface, a process known as degassing. On Earth, Smye said, the same process occurs with water in subduction fault zones.
This degassing process increases based on what types and quantities of heat-producing elements are present in a planet up to a certain point, said Foley.
“There’s a sweet spot range where a planet is releasing enough carbon dioxide to keep the planet from freezing over, but not so much that the weathering can’t pull carbon dioxide out of the atmosphere and keep the climate temperate,” he said.
According to the researchers’ model, the presence and amount of heat-producing elements were far better indicators for a planet’s potential to sustain life.
“One interesting take-home point of this study is that the initial composition or size of a planet is important in setting the trajectory for habitability,” said Smye. “The future fate of a planet is set from the outset of its birth.”
The incidence of tickborne infections in the United States has risen significantly within the past decade. It is imperative, therefore, that public health officials and scientists build a robust understanding of pathogenesis, design improved diagnostics, and develop preventive vaccines.
Bacteria cause most tickborne diseases in the United States, with Lyme disease representing the majority (82 percent) of reported cases. The spirochete Borrelia burgdorferi is the primary cause of Lyme disease in North America; it is carried by hard-bodied ticks that then feed on smaller mammals, such as white-footed mice, and larger animals, such as white-tailed deer. Although there are likely many factors contributing to increased Lyme disease incidence in the U.S., greater tick densities and their expanding geographical range have played a key role, the authors write. For example, the Ixodes scapularis tick, which is the primary source of Lyme disease in the northeastern U.S., had been detected in nearly 50 percent more counties by 2015 than was previously reported in 1996. Although most cases of Lyme disease are successfully treated with antibiotics, 10 to 20 percent of patients report lingering symptoms after effective antimicrobial therapy. Scientists need to better understand this lingering morbidity, note the authors.
Tickborne virus infections are also increasing and could cause serious illness and death. For example, Powassan virus (POWV), recognized in 1958, causes a febrile illness that can be followed by progressive and severe neurologic conditions, resulting in death in 10 to 15 percent of cases and long-term symptoms in as many as 70 percent of survivors. Only 20 U.S. cases of POWV infection were reported before 2006; 99 cases were reported between 2006 and 2016.
The public health burden of tickborne disease is considerably underreported, according to the authors. For example, the U.S. Centers for Disease Control and Prevention (CDC) reports approximately 30,000 cases of Lyme disease annually in the U.S. but estimates that the true incidence is 10 times that number. According to the authors, this is due in part to the limitations of current tickborne disease surveillance, as well as current diagnostics, which may be imprecise in some cases and are unable to recognize new tickborne pathogens as they emerge. These limitations have led researchers to explore new, innovative diagnostics with different platforms that may provide clinical benefit in the future.
It is also critical that scientists develop vaccines to prevent disease, the authors write. A vaccine to protect against Lyme disease was previously developed, but was pulled from the market and is no longer available. Future protective measures could include vaccines specifically designed to create an immune response to a pathogen, or to target pathogens inside the ticks that carry them.
By focusing research on the epidemiology of tickborne diseases, improving diagnostics, finding new treatments and developing preventive vaccines, public health officials and researchers may be able to stem the growing threat these diseases pose. In the meantime, the authors suggest, healthcare providers should advise their patients to use basic prevention techniques: wear insect repellant, wear long pants when walking in the woods or working outdoors, and check for ticks.
A team of scientists from the University of California, Irvine has found evidence of significant mass loss in East Antarctica’s Totten and Moscow University glaciers, which, if they fully collapsed, could add 5 meters (16.4 feet) to the global sea level.
The glaciologists estimate that between April 2002 and September 2016, the two glaciers lost about 18.5 billion tons of ice per year — equivalent to 0.7 millimeters (0.03 inches) of global sea level rise over the analyzed time period.
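The two figures quoted above are consistent with each other, which can be checked with a quick back-of-envelope calculation. This sketch assumes the commonly used conversion that roughly 362.5 gigatonnes of ice corresponds to 1 millimeter of global mean sea-level rise; the time span and loss rate come from the article.

```python
# Back-of-envelope check of the reported mass loss vs. sea-level rise.
# Assumed conversion: ~362.5 Gt of ice ~ 1 mm of global mean sea level.
GT_PER_MM_SLR = 362.5               # gigatonnes per millimeter of rise
loss_rate_gt_per_yr = 18.5          # reported combined loss rate

# April 2002 through September 2016, expressed in decimal years:
years = (2016 + 9 / 12) - (2002 + 4 / 12)   # about 14.4 years

total_loss_gt = loss_rate_gt_per_yr * years
slr_mm = total_loss_gt / GT_PER_MM_SLR
print(f"{total_loss_gt:.0f} Gt lost ≈ {slr_mm:.2f} mm of sea-level rise")
```

The result comes out near 0.7 mm, matching the figure in the article.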
UCI’s researchers discovered this by applying a locally optimized technique to data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite mission, combined with mass balance approximations from regional atmospheric climate models and ice discharge measurements from NASA’s Operation IceBridge and MEaSUREs projects.

“For this research, we used an improved methodology with GRACE data to retrieve the mass loss in an area undergoing rapid change,” said a graduate student in UCI’s Department of Earth System Science. “By overlaying these data with independent measurements, we improve our confidence in the results and the conclusion that Totten and Moscow University are imperiled.”
Making up roughly two-thirds of the Antarctic continent, East Antarctica has been viewed by polar researchers as less threatened by climate change than the volatile ice sheets in West Antarctica and the Antarctic Peninsula.
“Both of these glaciers are vulnerable to the intrusion of warm ocean water and hold considerable potential for sea level rise,” said co-author Eric Rignot, Donald Bren Professor and chair of Earth system science at UCI. “This work highlights that East Antarctic glaciers are as important to our future as those in the continent’s western regions.”
According to co-author Isabella Velicogna, professor of Earth system science, it’s challenging to study the Totten and Moscow University glaciers because the signal of change is much weaker than that of their counterparts in the west.
“In this remote part of the world, the data from GRACE and other satellite missions are critical for us to understand the glacier evolution,” she said.
During the Late Pleistocene period (between 125,000 and 12,000 years ago), two bear species roamed Europe: the omnivorous brown bear and the extinct, mostly vegetarian cave bear.
Until now, very little was known about the dietary evolution of the cave bear and how it became a vegetarian, as fossils of its direct ancestor, Deninger’s bear, are extremely scarce.
However, a new study sheds light on this. A research team from Germany and Spain found that Deninger’s bear likely had a diet similar to that of its descendant, the classic cave bear: new analyses reveal a distinct morphology of the cranium, mandible and teeth that has been linked to a dietary specialization in greater consumption of plant matter.
To understand the evolution of the cave bear lineage, the researchers micro-CT scanned the rare fossils and digitally removed the sediments so as not to risk damaging the fossils. Using sophisticated statistical methods, called geometric morphometrics, the researchers compared the three-dimensional shape of the mandibles and skull of Deninger’s bear with that of classic cave bears and modern bears.
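The core step in geometric morphometrics is Procrustes superimposition: landmark configurations digitized on each specimen are centered, scaled and rotated into alignment so that only shape differences remain. The following sketch illustrates the idea on made-up 2D landmark data (the actual study used many 3D landmarks on the scanned fossils), using SciPy's built-in Procrustes routine.

```python
import numpy as np
from scipy.spatial import procrustes

# Hypothetical 2D landmark configurations for two mandibles.
# Real analyses use dozens of 3D landmarks digitized on each fossil.
rng = np.random.default_rng(0)
mandible_a = rng.random((6, 2))          # 6 landmarks, 2 coordinates each

# A rotated, scaled, translated copy of the same shape, plus tiny noise:
theta = 0.4
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
mandible_b = 1.8 * mandible_a @ rot.T + 3.0 + rng.normal(0, 0.01, (6, 2))

# Procrustes superimposition removes position, scale and rotation;
# `disparity` measures the residual (pure shape) difference.
_, _, disparity = procrustes(mandible_a, mandible_b)
print(f"shape disparity: {disparity:.4f}")  # near zero => similar shapes
```

A small disparity, as here, means the two configurations have essentially the same shape; comparing such values across specimens is what lets researchers say that Deninger's bear mandibles cluster with those of cave bears rather than brown bears.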
“The analyses showed that Deninger’s bear had very similarly shaped mandibles and skull to the classic cave bear,” the researchers reported. This implies that they were adapted to the same food types and were primarily vegetarian.
“There is an ongoing discussion on the extent to which the classic cave bear was a vegetarian. And, this is especially why the new information on the diet of its direct ancestor is so important, because it teaches us that a differentiation between the diet of cave bears and brown bears was already established by 500,000 years ago and likely earlier,” said a doctoral candidate at the Universities of the Basque Country and Bordeaux and co-author of the study.
Interestingly, the researchers also found shape differences between the Deninger’s bears from the Iberian Peninsula and those from the rest of Europe, differences that are unlikely to be related to diet.
They have come up with three possibilities to explain these differences: 1) the Iberian bears are chronologically younger than the rest; 2) the Pyrenees, acting as a natural barrier, resulted in some genetic differentiation between the Iberian bears and those from the rest of Europe; or 3) there were multiple lineages, with either just one leading to the classic cave bear, or each lineage leading to a different group of cave bears.