A global shift towards healthy and more plant-based diets, halving food loss and waste, and improving farming practices and technologies are required to feed 10 billion people sustainably by 2050, a new study finds. Adopting these options reduces the risk of crossing global environmental limits related to climate change, the use of agricultural land, the extraction of freshwater resources, and the pollution of ecosystems through overapplication of fertilizers, according to the researchers.
The study is the first to quantify how food production and consumption affect the planetary boundaries that describe a safe operating space for humanity, beyond which Earth’s vital systems could become unstable.
“No single solution is enough to avoid crossing planetary boundaries. But when the solutions are implemented together, our research indicates that it may be possible to feed the growing population sustainably,” says Dr Marco Springmann of the Oxford Martin Programme on the Future of Food and the Nuffield Department of Population Health at the University of Oxford, who led the study.
“Without concerted action, we found that the environmental impacts of the food system could increase by 50-90% by 2050 as a result of population growth and the rise of diets high in fats, sugars and meat. In that case, all planetary boundaries related to food production would be surpassed, some of them by more than twofold.”
The study, funded by EAT as part of the EAT-Lancet Commission for Food, Planet and Health and by Wellcome’s “Our Planet, Our Health” partnership on Livestock Environment and People, combined detailed environmental accounts with a model of the global food system that tracks the production and consumption of food across the world. With this model, the researchers analysed several options that could keep the food system within environmental limits. They found:
- Climate change cannot be sufficiently mitigated without dietary changes towards more plant-based diets. Adopting more plant-based “flexitarian” diets globally could reduce greenhouse gas emissions by more than half, and also reduce other environmental impacts, such as fertilizer application and the use of cropland and freshwater, by a tenth to a quarter.
- In addition to dietary changes, improving management practices and technologies in agriculture is required to limit pressures on agricultural land, freshwater extraction, and fertilizer use. Increasing agricultural yields from existing cropland, balancing application and recycling of fertilizers, and improving water management, could, along with other measures, reduce those impacts by around half.
- Finally, halving food loss and waste is needed to keep the food system within environmental limits. If achieved globally, halving food loss and waste could reduce environmental impacts by up to a sixth (16%).
“Many of the solutions we analysed are being implemented in some parts of the world, but it will need strong global co-ordination and rapid upscale to make their effects felt.”
“Improving farming technologies and management practices will require increasing investment in research and public infrastructure, the right incentive schemes for farmers, including support mechanisms to adopt best available practices, and better regulation, for example of fertilizer use and water quality,” says Line Gordon, executive director of the Stockholm Resilience Centre and an author on the report.
“Tackling food loss and waste will require measures across the entire food chain, from storage and transport, through food packaging and labelling, to changes in legislation and business behaviour that promote zero-waste supply chains.”
“When it comes to diets, comprehensive policy and business approaches are essential to make dietary changes towards healthy and more plant-based diets possible and attractive for a large number of people. Important aspects include school and workplace programmes, economic incentives and labelling, and aligning national dietary guidelines with the current scientific evidence on healthy eating and the environmental impacts of our diet,” adds Springmann.
All large-scale energy systems have environmental impacts, and the ability to compare the impacts of renewable energy sources is an important step in planning a future without coal or gas power. Extracting energy from the wind causes climatic impacts that are small compared to current projections of 21st century warming, but large compared to the effect of reducing US electricity emissions to zero with solar. Research publishing in the journal Joule on October 4 reports the most accurate modelling yet of how increasing wind power would affect climate, finding that large-scale wind power generation would warm the Continental United States by 0.24 degrees Celsius because wind turbines redistribute heat in the atmosphere.
“Wind beats coal by any environmental measure, but that doesn’t mean that its impacts are negligible,” says senior author David Keith, an engineering and public policy professor at Harvard University. “We must quickly transition away from fossil fuels to stop carbon emissions. In doing so, we must make choices between various low-carbon technologies, all of which have some social and environmental impacts.”
“Wind turbines generate electricity but also alter the atmospheric flow,” says first author Lee Miller. “Those effects redistribute heat and moisture in the atmosphere, which impacts climate. We attempted to model these effects on a continental scale.”
To compare the impacts of wind and solar, Keith and Miller started by establishing a baseline for the 2012-2014 US climate using a standard weather forecasting model. Then they added in the effect on the atmosphere of covering one third of the Continental US with enough wind turbines to meet present-day US electricity demand. This is a relevant scenario if wind power plays a major role in decarbonizing the energy system in the latter half of this century. This scenario would warm the surface temperature of the Continental US by 0.24 degrees Celsius.
Their analysis focused on the comparison of climate impacts and benefits. They found that it would take about a century to offset that effect with wind-related reductions in greenhouse gas concentrations. This timescale was roughly independent of the specific choice of total wind power generation in their scenarios.
“The direct climate impacts of wind power are instant, while the benefits accumulate slowly,” says Keith. “If your perspective is the next 10 years, wind power actually has — in some respects — more climate impact than coal or gas. If your perspective is the next thousand years, then wind power is enormously cleaner than coal or gas.”
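The trade-off Keith describes can be illustrated with a toy calculation. This is not the paper’s model; the avoided-warming rate below is an assumed illustrative number chosen only to show why the crossover takes on the order of a century: the turbines’ direct warming appears at once, while the avoided warming accumulates year by year.

```python
# Toy illustration (assumed numbers, not the study's model): direct
# warming from wind turbines appears immediately, while climate
# benefits accumulate as avoided emissions add up over time.
DIRECT_WARMING_C = 0.24               # CONUS warming from turbines (study's figure)
AVOIDED_WARMING_PER_YEAR_C = 0.0024   # assumed warming avoided per year of operation

def crossover_year(direct_c, avoided_per_year_c):
    """Years until cumulative avoided warming equals the direct warming."""
    return direct_c / avoided_per_year_c

print(crossover_year(DIRECT_WARMING_C, AVOIDED_WARMING_PER_YEAR_C))  # ~100 years
```

With these assumed inputs the benefits overtake the direct impact after about a century, matching the order of magnitude quoted in the study.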
More than ten previous studies have now observed local warming caused by US wind farms. Keith and Miller compared their simulated warming to observations and found rough consistency between the observations and model.
They also compared wind power’s impacts with previous projections of solar power’s influence on the climate. They found that, for the same energy generation rate, solar power’s impacts would be about 10 times smaller than wind’s. But both sources of energy have their pros and cons.
“In terms of temperature difference per unit of energy generation, solar power has about 10 times less impact than wind,” says Miller. “But there are other considerations. For example, solar farms are dense, whereas the land between wind turbines can be co-utilized for agriculture.” The density of wind turbines and the time of day during which they operate can also influence the climatic impacts.
Keith and Miller’s simulations do not consider any impacts on global-scale meteorology, so it remains somewhat uncertain how such a deployment of wind power may affect the climate in other countries.
“The work should not be seen as a fundamental critique of wind power. Some of wind’s climate impacts may be beneficial. So rather, the work should be seen as a first step in getting more serious about assessing these impacts,” says Keith. “Our hope is that our study, combined with the recent direct observations, marks a turning point where wind power’s climatic impacts begin to receive serious consideration in strategic decisions about decarbonizing the energy system.”
Human evolution used to be depicted as a straight line, gradually progressing from an ape-like ancestor to modern Homo sapiens. But thanks to next-generation sequencing — as well as the discovery of genetic material from extinct subspecies of early humans — findings in recent years have shown that it wasn’t quite so orderly. The human family tree is full of twists and branches that helped shape what we are today. Now, a study published in the journal Cell is reporting new details about the role of viruses in shaping evolution, in particular viral interactions between modern humans and Neanderthals.
“It’s not a stretch to imagine that when modern humans met up with Neanderthals, they infected each other with pathogens that came from their respective environments,” the researchers say. “By interbreeding with each other, they also passed along genetic adaptations to cope with some of those pathogens.”
Current thinking is that modern humans began moving out of Africa and into Eurasia about 70,000 years ago. When they arrived, they met up with Neanderthals who, along with their own ancestors, had been adapting to that geographic area for hundreds of thousands of years. The Eurasian environment shaped Neanderthals’ evolution, including the development of adaptations to viruses and other pathogens that were present there but not in Africa.
The Cell study provides new details about the role of adaptive introgression, or hybridization between species, in human evolution. “Some of the Neanderthals had adaptive mutations that gave them advantages against these pathogens, and they were able to pass some of these mutations on to modern humans,” explains Enard, who completed the work while he was a postdoctoral researcher at Stanford University. “That’s called positive natural selection — it favors certain individuals that carry these advantageous mutations.”
Their earlier research focused on how viruses impacted the evolution of humans. In 2016, they reported that about one-third of protein adaptations since humans split from other great apes were driven by a response to infectious viruses. The new work, which built on those findings, looked at which of those adaptations may have come from Neanderthals.
In the current study, the investigators annotated thousands of genes in the human genome that are known to interact with pathogens — more than 4,000 of the 25,000 total genes. “We focused on these genes because the ones that interact with viruses are much more likely to have been involved in adaptation against infectious disease compared with genes that don’t have anything to do with viruses.”
They then looked at whether there was an enrichment of stretches of Neanderthal DNA in those 4,000 genes. Earlier studies from other groups have shown that Neanderthal DNA is present in humans. Those sequences are publicly available to investigators in the field. Based on the analysis, Enard and Petrov found strong evidence that adaptive genes that provided resistance against viruses were shared between Neanderthals and modern humans.
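An enrichment analysis of this kind can be sketched as a one-sided hypergeometric test: given how many genes carry Neanderthal-derived sequence, is the overlap with virus-interacting genes larger than chance would predict? The counts below are toy numbers chosen for illustration, not the study’s data.

```python
from math import comb

def enrichment_pvalue(total_genes, virus_genes, neand_genes, observed_overlap):
    """One-sided hypergeometric p-value: probability of seeing at least
    observed_overlap virus-interacting genes among neand_genes drawn at
    random from total_genes."""
    p = 0.0
    for k in range(observed_overlap, min(virus_genes, neand_genes) + 1):
        p += (comb(virus_genes, k)
              * comb(total_genes - virus_genes, neand_genes - k)
              / comb(total_genes, neand_genes))
    return p

# Toy counts: 2,500 genes, 400 virus-interacting, 50 Neanderthal-derived,
# 12 in the overlap (8 would be expected by chance).
print(enrichment_pvalue(2500, 400, 50, 12))
```

A small p-value here would indicate that Neanderthal-derived segments cluster in virus-interacting genes more than random placement would explain.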
“Many Neanderthal sequences have been lost in modern humans, but some stayed and appear to have quickly increased to high frequencies at the time of contact, suggestive of their selective benefits at that time,” Petrov says. “Our research aims to understand why that was the case. We believe that resistance to specific RNA viruses provided by these Neanderthal sequences was likely a big part of the reason for their selective benefits.”
“One of the things that population geneticists have wondered about is why we have maintained these stretches of Neanderthal DNA in our own genomes,” Enard adds. “This study suggests that one of the roles of those genes was to provide us with some protection against pathogens as we moved into new environments.”
Researchers from Yale-NUS College and the University of Fribourg in Switzerland have discovered a novel colour-generation mechanism in nature which, if harnessed, has the potential to create cosmetics and paints with purer and more vivid hues and screen displays that project the same true image when viewed from any angle, and even to reduce signal loss in optical fibres. Dr Saranathan examined the rainbow-coloured patterns in the elytra (wing casings) of a snout weevil from the Philippines, Pachyrrhynchus congestus pavonius, using high-energy X-rays, while Dr Wilts performed detailed scanning electron microscopy and optical modelling. They discovered that to produce the rainbow palette of colours, the weevil utilised a colour-generation mechanism that had so far been found only in squid, cuttlefish, and octopuses, which are renowned for their colour-shifting camouflage. The study was published in the peer-reviewed journal Small.
P. c. pavonius, or the “Rainbow” Weevil, is distinctive for its rainbow-coloured spots on its thorax and elytra. These spots are made up of nearly-circular scales arranged in concentric rings of different hues, ranging from blue in the centre to red at the outside, just like a rainbow. While many insects have the ability to produce one or two colours, it is rare that a single insect can produce such a vast spectrum of colours. Researchers are interested in figuring out the mechanism behind the natural formation of these colour-generating structures, as current technology is unable to synthesise structures of this size.
“The ultimate aim of research in this field is to figure out how the weevil self-assembles these structures, because with our current technology we are unable to do so,” Dr Saranathan said. “The ability to produce these structures, which are able to provide a high colour fidelity regardless of the angle you view it from, will have applications in any industry which deals with colour production. We can use these structures in cosmetics and other pigmentations to ensure high-fidelity hues, or in digital displays in your phone or tablet which will allow you to view it from any angle and see the same true image without any colour distortion. We can even use them to make reflective cladding for optical fibres to minimise signal loss during transmission.”
Dr Saranathan and Dr Wilts examined the scales and determined that they were composed of a three-dimensional crystalline structure made from chitin (the main ingredient in insect exoskeletons). They discovered that the vibrant rainbow colours on this weevil’s scales are determined by two factors: the size of the crystal structure which makes up each scale, as well as the volume of chitin used to make up the crystal structure. Larger scales have a larger crystalline structure and use a larger volume of chitin to reflect red light; smaller scales have a smaller crystalline structure and use a smaller volume of chitin to reflect blue light. According to Dr Saranathan, who previously examined over 100 species of insects and spiders and catalogued their colour-generation mechanisms, this ability to simultaneously control both size and volume factors to fine-tune the colour produced has never before been shown in insects, and given its complexity, is quite remarkable. “It is different from the usual strategy employed by nature to produce various different hues on the same animal, where the chitin structures are of fixed size and volume, and different colours are generated by orienting the structure at different angles, which reflects different wavelengths of light,” Dr Saranathan explained.
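The size-to-colour relationship can be roughly illustrated with a simplified first-order Bragg condition for a photonic crystal: the peak reflected wavelength scales with the lattice spacing, so larger structures reflect redder light. The spacing and effective-index values below are illustrative assumptions, not measurements from the study.

```python
# Simplified normal-incidence Bragg estimate (illustrative values only):
# peak reflected wavelength scales with lattice spacing, so larger
# crystal structures reflect redder light.
def peak_wavelength_nm(spacing_nm, n_eff=1.25):
    """First-order Bragg peak: lambda = 2 * n_eff * d."""
    return 2.0 * n_eff * spacing_nm

print(peak_wavelength_nm(180.0))  # smaller spacing -> 450 nm (blue)
print(peak_wavelength_nm(260.0))  # larger spacing -> 650 nm (red)
```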
An international team of researchers has proposed a new method to investigate the inner workings of supernovae explosions. This new method uses meteorites and is unique in that it can determine the contribution from electron anti-neutrinos, enigmatic particles which can’t be tracked through other means.
Supernovae are important events in the evolution of stars and galaxies, but the details of how the explosions occur are still unknown. By measuring the amount of 98Ru (an isotope of Ruthenium) in meteorites, it should be possible to estimate how much of its progenitor 98Tc (a short-lived isotope of Technetium) was present in the material from which the Solar System formed. The amount of 98Tc in turn is sensitive to the characteristics, such as temperature, of electron anti-neutrinos in the supernova process; as well as to how much time passed between the supernova and the formation of the Solar System. The expected traces of 98Tc are only a little below the smallest currently detectable levels, raising hopes that they will be measured in the near future.
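The time sensitivity mentioned above comes from simple radioactive decay. A minimal sketch, assuming a 98Tc half-life of roughly 4.2 million years (the study’s actual modelling is far more detailed): the longer the gap between the supernova and Solar System formation, the smaller the surviving fraction of 98Tc.

```python
import math

HALF_LIFE_YR = 4.2e6  # assumed approximate half-life of 98Tc

def surviving_fraction(delay_yr, half_life_yr=HALF_LIFE_YR):
    """Fraction of freshly synthesized 98Tc remaining after delay_yr."""
    return math.exp(-math.log(2.0) * delay_yr / half_life_yr)

for delay in (1e6, 1e7, 2e7):
    print(f"{delay:.0e} yr between supernova and Solar System formation "
          f"-> {surviving_fraction(delay):.3e} of the 98Tc survives")
```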
“There are six neutrino species. Previous studies have shown that neutrino-isotopes are predominantly produced by the five neutrino species other than the electron anti-neutrino. By finding a neutrino-isotope synthesized predominantly by the electron anti-neutrino, we can estimate the temperatures of all six neutrino species, which are important for understanding the supernova explosion mechanism.”
At the end of its life, a massive star dies in a fiery explosion known as a supernova. This explosion blasts most of the mass in the star out into outer space. That mass is then recycled into new stars and planets, leaving distinct chemical signatures which tell scientists about the supernova. Meteorites, sometimes called falling stars, formed from material left over from the birth of the Solar System, thus preserving the original chemical signatures.
An international team, including researchers at Stony Brook University and the Max Planck Institute for the Science of Human History, has found the earliest and largest monumental cemetery in eastern Africa. The Lothagam North Pillar Site was built 5,000 years ago by early pastoralists living around Lake Turkana, Kenya. This group is believed to have had an egalitarian society, without a stratified social hierarchy. Thus their construction of such a large public project contradicts long-standing narratives about early complex societies, which suggest that a stratified social structure is necessary to enable the construction of large public buildings or monuments.
The Lothagam North Pillar Site was a communal cemetery constructed and used over a period of several centuries, between about 5,000 and 4,300 years ago. Early herders built a platform approximately 30 meters in diameter and excavated a large cavity in the center to bury their dead. After the cavity was filled and capped with stones, the builders placed large megalithic pillars, some sourced from as much as a kilometer away, on top. Stone circles and cairns were added nearby. An estimated minimum of 580 individuals were densely buried within the central platform cavity of the site. Men, women, and children of different ages, from infants to the elderly, were all buried in the same area, without any particular burials being singled out with special treatment. Additionally, essentially all individuals were buried with personal ornaments and the distribution of ornaments was approximately equal throughout the cemetery. These factors indicate a relatively egalitarian society without strong social stratification.
Historically, archeologists have theorized that people built permanent monuments as reminders of shared history, ideals and culture, when they had established a settled, socially stratified agriculture society with abundant resources and strong leadership. It was believed that a political structure and the resources for specialization were prerequisites to engaging in monument building. Ancient monuments have thus previously been regarded as reliable indicators of complex societies with differentiated social classes. However, the Lothagam North cemetery was constructed by mobile pastoralists who show no evidence of a rigid social hierarchy. “This discovery challenges earlier ideas about monumentality,” explains Elizabeth Sawchuk of Stony Brook University and the Max Planck Institute for the Science of Human History. “Absent other evidence, Lothagam North provides an example of monumentality that is not demonstrably linked to the emergence of hierarchy, forcing us to consider other narratives of social change.”
The discovery is consistent with similar examples elsewhere in Africa and on other continents in which large, monumental structures have been built by groups thought to be egalitarian in their social organization. This research has the potential to reshape global perspectives on how — and why — large groups of people come together to form complex societies. In this case, it appears that Lothagam North was built during a period of profound change. Pastoralism had just been introduced to the Turkana Basin and newcomers arriving with sheep, goats, and cattle would have encountered diverse groups of fisher-hunter-gatherers already living around the lake. Additionally, newcomers and locals faced a difficult environmental situation, as annual rainfall decreased during this period and Lake Turkana shrank by as much as fifty percent. Early herders may have constructed the cemetery as a place for people to come together to form and maintain social networks to cope with major economic and environmental change.
“The monuments may have served as a place for people to congregate, renew social ties, and reinforce community identity,” states Anneke Janzen also of the Max Planck Institute for the Science of Human History. “Information exchange and interaction through shared ritual may have helped mobile herders navigate a rapidly changing physical landscape.” After several centuries, pastoralism became entrenched and lake levels stabilized. It was around this time that the cemetery ceased to be used.
“The Lothagam North Pillar Site is the earliest known monumental site in eastern Africa, built by the region’s first herders,” states Hildebrand. “This finding makes us reconsider how we define social complexity, and the kinds of motives that lead groups of people to create public architecture.”
Researchers funded by the National Eye Institute (NEI) have reversed congenital blindness in mice by changing supportive cells in the retina called Müller glia into rod photoreceptors. The findings advance efforts toward regenerative therapies for blinding diseases such as age-related macular degeneration and retinitis pigmentosa.
“This is the first report of scientists reprogramming Müller glia to become functional rod photoreceptors in the mammalian retina.” “Rods allow us to see in low light, but they may also help preserve cone photoreceptors, which are important for color vision and high visual acuity. Cones tend to die in later-stage eye diseases. If rods can be regenerated from inside the eye, this might be a strategy for treating diseases of the eye that affect photoreceptors.”
Photoreceptors are light-sensitive cells in the retina in the back of the eye that signal the brain when activated. In mammals, including mice and humans, photoreceptors fail to regenerate on their own. Like most neurons, once mature they don’t divide.
Scientists have long studied the regenerative potential of Müller glia because in other species, such as zebrafish, they divide in response to injury and can turn into photoreceptors and other retinal neurons. The zebrafish can thus regain vision after severe retinal injury. In the lab, however, scientists can coax mammalian Müller glia to behave more like they do in the fish. But it requires injuring the tissue.
“From a practical standpoint, if you’re trying to regenerate the retina to restore a person’s vision, it is counterproductive to injure it first to activate the Müller glia.”
“We wanted to see if we could program Müller glia to become rod photoreceptors in a living mouse without having to injure its retina.”
In the first phase of a two-stage reprogramming process, Chen’s team spurred Müller glia in normal mice to divide by injecting their eyes with a gene to turn on a protein called beta-catenin. Weeks later, they injected the mice’s eyes with factors that encouraged the newly divided cells to develop into rod photoreceptors.
The researchers used microscopy to visually track the newly formed cells. They found that the newly formed rod photoreceptors looked structurally no different from real photoreceptors. In addition, synaptic structures that allow the rods to communicate with other types of neurons within the retina had also formed. To determine whether the Müller glia-derived rod photoreceptors were functional, they tested the treatment in mice with congenital blindness, which meant that they were born without functional rod photoreceptors.
In the treated mice that were born blind, Müller glia-derived rods developed just as effectively as they had in normal mice. Functionally, they confirmed that the newly formed rods were communicating with other types of retinal neurons across synapses. Furthermore, light responses recorded from retinal ganglion cells — neurons that carry signals from photoreceptors to the brain — and measurements of brain activity confirmed that the newly-formed rods were in fact integrating in the visual pathway circuitry, from the retina to the primary visual cortex in the brain.
Chen’s lab is conducting behavioral studies to determine whether the mice have regained the ability to perform visual tasks such as a water maze task. Chen also plans to see if the technique works on cultured human retinal tissue.
New archaeological research from The Australian National University (ANU) has found that Homo erectus, an extinct species of primitive humans, died out in part because they were ‘lazy’.
An archaeological excavation of sites occupied by ancient human populations in the Arabian Peninsula during the Early Stone Age found that Homo erectus used ‘least-effort strategies’ for tool making and collecting resources.
This ‘laziness’ paired with an inability to adapt to a changing climate likely played a role in the species going extinct.
“They really don’t seem to have been pushing themselves.”
“I don’t get the sense they were explorers looking over the horizon. They didn’t have that same sense of wonder that we have.”
Dr Shipton said this was evident in the way the species made their stone tools and collected resources.
“To make their stone tools they would use whatever rocks they could find lying around their camp, which were mostly of lower quality than those used by later stone tool makers.”
“At the site we looked at there was a big rocky outcrop of quality stone just a short distance away up a small hill.
“But rather than walk up the hill they would just use whatever bits had rolled down and were lying at the bottom.
“When we looked at the rocky outcrop there were no signs of any activity, no artefacts and no quarrying of the stone.
“They knew it was there, but because they had enough adequate resources they seem to have thought, ‘why bother?’.”
This is in contrast to the stone tool makers of later periods, including early Homo sapiens and Neanderthals, who were climbing mountains to find good quality stone and transporting it over long distances.
Dr Shipton said a failure to progress technologically, as their environment dried out into a desert, also contributed to the population’s demise.
“Not only were they lazy, but they were also very conservative,” Dr Shipton said.
“The sediment samples showed the environment around them was changing, but they were doing the exact same things with their tools.
“There was no progression at all, and their tools are never very far from these now dry river beds. I think in the end the environment just got too dry for them.”
The excavation and survey work was undertaken in 2014 at the site of Saffaqah near Dawadmi in central Saudi Arabia.
Step aside carrots, onions and broccoli. The newest heart-healthy vegetable could be a gigantic, record-setting radish. Scientists report that compounds found in the Sakurajima Daikon, or “monster,” radish could help protect coronary blood vessels and potentially prevent heart disease and stroke. The finding could lead to the discovery of similar substances in other vegetables and perhaps to new drug treatments.
Grown for centuries in Japan, the Sakurajima Daikon is one of the Earth’s most massive vegetables. In 2003, the Guinness Book of World Records certified a Sakurajima weighing nearly 69 pounds as the world’s heaviest radish. Radishes are good sources of antioxidants and reportedly can reduce high blood pressure and the threat of clots, a pair of risk factors for heart attack and stroke. But to date, no studies have directly compared the heart-health benefits of the Sakurajima Daikon to other radishes. To address this knowledge gap, Katsuko Kajiya and colleagues sought to find out what effects this radish would have on nitric oxide production, a key regulator of coronary blood vessel function, and to determine its underlying mechanisms.
The researchers exposed human and pig vascular endothelial cells to extracts from Sakurajima Daikon and smaller radishes. Using fluorescence microscopy and other analytical techniques, the research team found the Sakurajima Daikon radish induced more nitric oxide production in these vascular cells than a smaller Japanese radish. They also identified trigonelline, a plant hormone, as the active component in Sakurajima Daikon that appears to promote a cascade of changes in coronary blood vessels resulting in improved nitric oxide production.
A new study finds that the enzymes behind the Bermuda fireworm’s glow are unique among bioluminescent animals and entirely unlike those seen in fireflies. The study also examines genes associated with some of the dramatic — and reversible — changes that happen to the fireworms during reproduction.
The beautiful bioluminescence of the Bermuda fireworm, which lives throughout the Caribbean, was first documented in 1492 by Christopher Columbus and his crew just before landing in the Americas. The observations described the lights as “looking like the flame of a small candle alternately raised and lowered.”
The phenomenon went unexplained until the 1930s, when scientists matched the historic description with the unusual and precisely timed mating behavior of fireworms. During summer and autumn, beginning at 22 minutes after sunset on the third night after the full Moon, spawning female fireworms secrete a bright bluish-green luminescence that attracts males. “It’s like they have pocket watches,” said lead author Mercer R. Brugler, a Museum research associate and assistant professor at New York City College of Technology (City Tech).
“The female worms come up from the bottom and swim quickly in tight little circles as they glow, which looks like a field of little cerulean stars across the surface of jet black water,” said Mark Siddall, a curator in the American Museum of Natural History’s Division of Invertebrate Zoology and corresponding author of the study. “Then the males, homing in on the light of the females, come streaking up from the bottom like comets — they luminesce, too. There’s a little explosion of light as both dump their gametes in the water. It is by far the most beautiful biological display I have ever witnessed.”
To further investigate this phenomenon, Siddall, together with Brugler; Michael Tessler, a postdoctoral fellow in the Museum’s Sackler Institute for Comparative Genomics; and M. Teresa Aguado, a former postdoctoral fellow in the Museum’s Sackler Institute for Comparative Genomics who is now at the Autonomous University of Madrid, analyzed the transcriptome — the full set of RNA molecules — of a dozen female fireworms from Ferry Reach in Bermuda.
Their findings support previous work showing that fireworms “glow” because of a special luciferase enzyme they produce. These enzymes are the principal drivers of bioluminescence across the tree of life, in organisms as diverse as copepods, fungi, and jellyfish. However, the luciferases found in Bermuda fireworms and their relatives are distinct from those found in any other organism to date.
The work also took a close look at genes related to the precise reproductive timing of the fireworms, as well as the changes that take place in the animals’ bodies just prior to swarming events. These changes include the enlargement and pigmentation of the worms’ four eyes and the modification of the nephridia — an organ similar to the kidney in vertebrates — to store and release gametes.