
Feeding 10 billion people by 2050 within planetary limits may be achievable

Posted on



A global shift towards healthy and more plant-based diets, halving food loss and waste, and improving farming practices and technologies are required to feed 10 billion people sustainably by 2050, a new study finds. Adopting these options reduces the risk of crossing global environmental limits related to climate change, the use of agricultural land, the extraction of freshwater resources, and the pollution of ecosystems through overapplication of fertilizers, according to the researchers.

The study is the first to quantify how food production and consumption affect the planetary boundaries that describe a safe operating space for humanity, beyond which Earth’s vital systems could become unstable.

“No single solution is enough to avoid crossing planetary boundaries. But when the solutions are implemented together, our research indicates that it may be possible to feed the growing population sustainably,” says Dr Marco Springmann of the Oxford Martin Programme on the Future of Food and the Nuffield Department of Population Health at the University of Oxford, who led the study.

“Without concerted action, we found that the environmental impacts of the food system could increase by 50-90% by 2050 as a result of population growth and the rise of diets high in fats, sugars and meat. In that case, all planetary boundaries related to food production would be surpassed, some of them by more than twofold.”

The study, funded by EAT as part of the EAT-Lancet Commission for Food, Planet and Health and by Wellcome’s “Our Planet, Our Health” partnership on Livestock Environment and People, combined detailed environmental accounts with a model of the global food system that tracks the production and consumption of food across the world. With this model, the researchers analysed several options that could keep the food system within environmental limits. They found:

  • Climate change cannot be sufficiently mitigated without dietary changes towards more plant-based diets. Adopting more plant-based “flexitarian” diets globally could reduce greenhouse gas emissions by more than half, and also reduce other environmental impacts, such as fertilizer application and the use of cropland and freshwater, by a tenth to a quarter.
  • In addition to dietary changes, improving management practices and technologies in agriculture is required to limit pressures on agricultural land, freshwater extraction, and fertilizer use. Increasing agricultural yields from existing cropland, balancing application and recycling of fertilizers, and improving water management, could, along with other measures, reduce those impacts by around half.
  • Finally, halving food loss and waste is needed for keeping the food system within environmental limits. Halving food loss and waste could, if globally achieved, reduce environmental impacts by up to a sixth (16%).
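Taken together, these levers combine multiplicatively against the projected growth in impacts. Below is a minimal back-of-the-envelope sketch in Python using purely illustrative mid-range factors drawn loosely from the figures above, not outputs of the study’s food-system model:

    # Illustrative combination of the three mitigation levers described above.
    # All factors are rough assumptions, not outputs of the study's food-system model.
    baseline_growth = 1.70   # impacts projected to grow 50-90% by 2050; take ~70%
    diet_cut = 0.50          # flexitarian diets: roughly halve greenhouse gas impacts
    tech_cut = 0.50          # better practices and technologies: roughly halve land/water/fertilizer pressure
    waste_cut = 0.16         # halving food loss and waste: up to ~16% reduction

    combined = baseline_growth * (1 - diet_cut) * (1 - tech_cut) * (1 - waste_cut)
    print(f"Impacts relative to today with all levers applied: {combined:.2f}")  # ~0.36

Under these assumed factors, impacts land at roughly a third of today’s level despite the projected growth, which illustrates why no single lever is sufficient on its own.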

“Many of the solutions we analysed are being implemented in some parts of the world, but it will need strong global co-ordination and rapid upscale to make their effects felt.”

“Improving farming technologies and management practices will require increasing investment in research and public infrastructure, the right incentive schemes for farmers, including support mechanisms to adopt best available practices, and better regulation, for example of fertilizer use and water quality,” says Line Gordon, executive director of the Stockholm Resilience Centre and an author on the report.

“Tackling food loss and waste will require measures across the entire food chain, from storage and transport, through food packaging and labelling, to changes in legislation and business behaviour that promote zero-waste supply chains.”

“When it comes to diets, comprehensive policy and business approaches are essential to make dietary changes towards healthy and more plant-based diets possible and attractive for a large number of people. Important aspects include school and workplace programmes, economic incentives and labelling, and aligning national dietary guidelines with the current scientific evidence on healthy eating and the environmental impacts of our diet,” adds Springmann.

Large-scale US wind power would cause warming that would take roughly a century to offset

Posted on



All large-scale energy systems have environmental impacts, and the ability to compare the impacts of renewable energy sources is an important step in planning a future without coal or gas power. Extracting energy from the wind causes climatic impacts that are small compared to current projections of 21st-century warming, but large compared to the effect of reducing US electricity emissions to zero with solar. Research published in the journal Joule on October 4 reports the most accurate modelling yet of how increasing wind power would affect climate, finding that large-scale wind power generation would warm the Continental United States by 0.24 degrees Celsius because wind turbines redistribute heat in the atmosphere.

“Wind beats coal by any environmental measure, but that doesn’t mean that its impacts are negligible,” says senior author David Keith, an engineering and public policy professor at Harvard University. “We must quickly transition away from fossil fuels to stop carbon emissions. In doing so, we must make choices between various low-carbon technologies, all of which have some social and environmental impacts.”

“Wind turbines generate electricity but also alter the atmospheric flow,” says first author Lee Miller. “Those effects redistribute heat and moisture in the atmosphere, which impacts climate. We attempted to model these effects on a continental scale.”

To compare the impacts of wind and solar, Keith and Miller started by establishing a baseline for the 2012-2014 US climate using a standard weather forecasting model. Then they added in the effect on the atmosphere of covering one third of the Continental US with enough wind turbines to meet present-day US electricity demand. This is a relevant scenario if wind power plays a major role in decarbonizing the energy system in the latter half of this century. This scenario would warm the surface temperature of the Continental US by 0.24 degrees Celsius.

Their analysis focused on the comparison of climate impacts and benefits. They found that it would take about a century to offset that effect with wind-related reductions in greenhouse gas concentrations. This timescale was roughly independent of the specific choice of total wind power generation in their scenarios.
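The century-scale figure can be read as a simple break-even between an immediate, roughly constant warming from heat redistribution and a benefit from avoided emissions that accrues slowly. The toy Python sketch below shows the shape of that comparison; the per-year benefit is an assumed number chosen only for illustration, not a value from the paper:

    # Toy break-even: constant turbine-induced warming vs. slowly accumulating
    # benefit from displaced fossil generation. The per-year benefit is an assumption.
    immediate_warming_c = 0.24           # the paper's Continental US warming figure
    avoided_warming_per_year_c = 0.0024  # assumed avoided warming per year of operation

    years = 0
    cumulative_benefit_c = 0.0
    while cumulative_benefit_c < immediate_warming_c:
        years += 1
        cumulative_benefit_c += avoided_warming_per_year_c

    print(f"Break-even after roughly {years} years")  # ~100 years with these assumptions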

“The direct climate impacts of wind power are instant, while the benefits accumulate slowly,” says Keith. “If your perspective is the next 10 years, wind power actually has — in some respects — more climate impact than coal or gas. If your perspective is the next thousand years, then wind power is enormously cleaner than coal or gas.”

More than ten previous studies have now observed local warming caused by US wind farms. Keith and Miller compared their simulated warming to these observations and found rough consistency between the observations and the model.

They also compared wind power’s impacts with previous projections of solar power’s influence on the climate. They found that, for the same energy generation rate, solar power’s impacts would be about 10 times smaller than wind’s. But both sources of energy have their pros and cons.

“In terms of temperature difference per unit of energy generation, solar power has about 10 times less impact than wind,” says Miller. “But there are other considerations. For example, solar farms are dense, whereas the land between wind turbines can be co-utilized for agriculture.” The density of wind turbines and the time of day during which they operate can also influence the climatic impacts.

Keith and Miller’s simulations do not consider any impacts on global-scale meteorology, so it remains somewhat uncertain how such a deployment of wind power may affect the climate in other countries.

“The work should not be seen as a fundamental critique of wind power. Some of wind’s climate impacts may be beneficial. So rather, the work should be seen as a first step in getting more serious about assessing these impacts,” says Keith. “Our hope is that our study, combined with the recent direct observations, marks a turning point where wind power’s climatic impacts begin to receive serious consideration in strategic decisions about decarbonizing the energy system.”

 

Viruses influenced gene sharing between Neanderthals and humans

Posted on



Human evolution used to be depicted as a straight line, gradually progressing from an ape-like ancestor to modern Homo sapiens. But thanks to next-generation sequencing — as well as the discovery of genetic material from extinct subspecies of early humans — findings in recent years have shown that it wasn’t quite so orderly. The human family tree is full of twists and branches that helped shape what we are today. Now, a study published in the journal Cell is reporting new details about the role of viruses in shaping evolution, in particular viral interactions between modern humans and Neanderthals.

“It’s not a stretch to imagine that when modern humans met up with Neanderthals, they infected each other with pathogens that came from their respective environments. By interbreeding with each other, they also passed along genetic adaptations to cope with some of those pathogens.”

Current thinking is that modern humans began moving out of Africa and into Eurasia about 70,000 years ago. When they arrived, they met up with Neanderthals who, along with their own ancestors, had been adapting to that geographic area for hundreds of thousands of years. The Eurasian environment shaped Neanderthals’ evolution, including the development of adaptations to viruses and other pathogens that were present there but not in Africa.

The Cell study provides new details about the role of adaptive introgression, or hybridization between species, in human evolution. “Some of the Neanderthals had adaptive mutations that gave them advantages against these pathogens, and they were able to pass some of these mutations on to modern humans,” explains Enard, who completed the work while he was a postdoctoral researcher at Stanford University. “That’s called positive natural selection — it favors certain individuals that carry these advantageous mutations.”

Enard and Petrov’s earlier research focused on how viruses affected human evolution. In 2016, they reported that about one-third of protein adaptations since humans split from other great apes were driven by a response to infectious viruses. The new work builds on those findings by looking at which of those adaptations may have come from Neanderthals.

In the current study, the investigators annotated thousands of genes in the human genome that are known to interact with pathogens — more than 4,000 of the 25,000 total genes. “We focused on these genes because the ones that interact with viruses are much more likely to have been involved in adaptation against infectious disease compared with genes that don’t have anything to do with viruses.”

They then looked at whether there was an enrichment of stretches of Neanderthal DNA in those 4,000 genes. Earlier studies from other groups have shown that Neanderthal DNA is present in humans. Those sequences are publicly available to investigators in the field. Based on the analysis, Enard and Petrov found strong evidence that adaptive genes that provided resistance against viruses were shared between Neanderthals and modern humans.
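The kind of enrichment question described here can be illustrated with a standard contingency-table test. The sketch below is not the authors’ pipeline and the counts are made up; it simply shows how one might ask whether virus-interacting genes overlap Neanderthal-derived segments more often than other genes do:

    # Generic enrichment test with hypothetical counts; not the study's data or method.
    from scipy.stats import fisher_exact

    # Rows: virus-interacting genes vs. all other genes.
    # Columns: genes overlapping Neanderthal-introgressed segments vs. not.
    table = [[600, 3400],      # hypothetical: 600 of ~4,000 virus-interacting genes overlap
             [2100, 18900]]    # hypothetical: 2,100 of ~21,000 other genes overlap

    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.1e}")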

“Many Neanderthal sequences have been lost in modern humans, but some stayed and appear to have quickly increased to high frequencies at the time of contact, suggestive of their selective benefits at that time,” Petrov says. “Our research aims to understand why that was the case. We believe that resistance to specific RNA viruses provided by these Neanderthal sequences was likely a big part of the reason for their selective benefits.”

“One of the things that population geneticists have wondered about is why we have maintained these stretches of Neanderthal DNA in our own genomes,” Enard adds. “This study suggests that one of the roles of those genes was to provide us with some protection against pathogens as we moved into new environments.”

 

New color-generation mechanism discovered in ‘rainbow’ weevil

Posted on



Researchers from Yale-NUS College and the University of Fribourg in Switzerland have discovered a novel colour-generation mechanism in nature which, if harnessed, could be used to create cosmetics and paints with purer and more vivid hues, screen displays that project the same true image when viewed from any angle, and even claddings that reduce signal loss in optical fibres. Dr Saranathan examined the rainbow-coloured patterns in the elytra (wing casings) of a snout weevil from the Philippines, Pachyrrhynchus congestus pavonius, using high-energy X-rays, while Dr Wilts performed detailed scanning electron microscopy and optical modelling. They discovered that, to produce the rainbow palette of colours, the weevil uses a colour-generation mechanism that has so far been found only in squid, cuttlefish, and octopuses, which are renowned for their colour-shifting camouflage. The study was published in the peer-reviewed journal Small.

P. c. pavonius, or the “Rainbow” Weevil, is distinctive for the rainbow-coloured spots on its thorax and elytra. These spots are made up of nearly circular scales arranged in concentric rings of different hues, ranging from blue in the centre to red at the outside, just like a rainbow. While many insects can produce one or two colours, it is rare for a single insect to produce such a vast spectrum of colours. Researchers want to figure out the mechanism behind the natural formation of these colour-generating structures, because current technology cannot synthesise structures of this size.

“The ultimate aim of research in this field is to figure out how the weevil self-assembles these structures, because with our current technology we are unable to do so,” Dr Saranathan said. “The ability to produce these structures, which are able to provide a high colour fidelity regardless of the angle you view it from, will have applications in any industry which deals with colour production. We can use these structures in cosmetics and other pigmentations to ensure high-fidelity hues, or in digital displays in your phone or tablet which will allow you to view it from any angle and see the same true image without any colour distortion. We can even use them to make reflective cladding for optical fibres to minimise signal loss during transmission.”

Dr Saranathan and Dr Wilts examined these scales and determined that they are composed of a three-dimensional crystalline structure made from chitin (the main ingredient in insect exoskeletons). They discovered that the vibrant rainbow colours on this weevil’s scales are determined by two factors: the size of the crystal structure that makes up each scale and the volume of chitin used to build it. Larger scales have a larger crystalline structure and a larger volume of chitin and reflect red light; smaller scales have a smaller crystalline structure and a smaller volume of chitin and reflect blue light.

According to Dr Saranathan, who previously examined over 100 species of insects and spiders and catalogued their colour-generation mechanisms, this ability to control both size and volume simultaneously to fine-tune the colour produced has never before been shown in insects and, given its complexity, is quite remarkable. “It is different from the usual strategy employed by nature to produce various different hues on the same animal, where the chitin structures are of fixed size and volume, and different colours are generated by orienting the structure at different angles, which reflects different wavelengths of light,” Dr Saranathan explained.
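As a rough illustration of why both factors matter, the reflected wavelength of a photonic crystal scales with its lattice spacing and with an effective refractive index that rises as more of the structure is filled with chitin. The sketch below uses a simplified normal-incidence Bragg relation and a crude effective-medium average; it is not the optical model from the paper, and the spacings, fill fractions, and refractive indices are assumptions:

    # Simplified Bragg-like estimate of the reflected colour of a chitin photonic crystal.
    # Not the paper's optical model; all numbers are illustrative assumptions.
    import math

    def reflected_wavelength_nm(lattice_spacing_nm, chitin_fill_fraction,
                                n_chitin=1.56, n_air=1.0):
        # Volume-weighted effective refractive index (crude effective-medium approximation).
        n_eff = math.sqrt(chitin_fill_fraction * n_chitin**2 +
                          (1 - chitin_fill_fraction) * n_air**2)
        return 2 * n_eff * lattice_spacing_nm  # normal-incidence Bragg condition

    print(reflected_wavelength_nm(180, 0.40))  # smaller, less-filled structure: ~450 nm (bluish)
    print(reflected_wavelength_nm(230, 0.55))  # larger, more-filled structure: ~615 nm (reddish)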

We may hear others’ footsteps, but how do we ignore our own?

Posted on



A team of scientists has uncovered the neural processes mice use to ignore their own footsteps, a discovery that offers new insights into how we learn to speak and play music.

“The ability to ignore one’s own footsteps requires the brain to store and recall memories and to make some pretty stellar computations,” explains David Schneider, an assistant professor at New York University’s Center for Neural Science and one of the paper’s lead authors. “These are the building blocks for other, more important sound-generating behaviors, like recognizing the sounds you make when learning how to speak or to play a musical instrument.”

The research used a familiar intuition, that we are usually unaware of the sound of our own footsteps, as a vehicle for understanding a larger neural phenomenon: how the brain monitors, recognizes, and remembers the sounds of its own movements in relation to those of the surrounding environment.

“The capacity to anticipate and discriminate these movement-related sounds from environmental sounds is critical to normal hearing,” Schneider explains. “But how the brain learns to anticipate the sounds resulting from our movements remains largely unknown.”

To explore this, Schneider and his colleagues designed an “acoustic virtual reality system” for the mice. Here, the scientists controlled the sounds the mice made while walking on a treadmill and monitored the animals’ neural activity, allowing them to identify the neural circuit mechanisms that learn to suppress movement-related sounds.

Overall, they found a flexibility in neural function — the mice developed an adjustable “sensory filter” that allowed them to ignore the sounds of their own footsteps. In turn, this allowed them to better detect other sounds arising from their surroundings.
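One convenient way to picture such a “sensory filter” is an adaptive cancellation scheme: the system learns a prediction of the self-generated sound locked to the step cycle, subtracts it from what it hears, and is left with only unexpected, external sounds. The Python sketch below is a generic least-mean-squares-style illustration of that idea, not a model of the actual neural circuit:

    # Generic adaptive cancellation of a predictable self-generated sound.
    # Purely illustrative; not a model of the mouse auditory circuit.
    import numpy as np

    steps = 2000
    footstep = np.tile([1.0, 0.0, 0.0, 0.0], steps // 4)  # repetitive self-generated sound
    external = np.zeros(steps)
    external[1500] = 2.0                                   # a rare external sound to detect
    heard = footstep + external

    predictions = np.zeros(4)   # one learned prediction per phase of the step cycle
    learning_rate = 0.05
    residuals = []
    for t in range(steps):
        phase = t % 4
        error = heard[t] - predictions[phase]   # what remains after subtracting the prediction
        predictions[phase] += learning_rate * error
        residuals.append(abs(error))

    print(f"mean residual, first 100 steps: {np.mean(residuals[:100]):.2f}")   # large: filter not yet learned
    print(f"mean residual, last 100 steps:  {np.mean(residuals[-100:]):.2f}")  # near zero: footsteps cancelled
    print(f"residual at the external sound: {residuals[1500]:.2f}")            # stands out clearly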

“For mice, this is really important,” said Schneider. “They are prey animals, so they really need to be able to listen for a cat creeping up on them, even when they’re walking and making noise.”

Being able to ignore the sounds of one’s own movements is likely important for humans as well. But the ability to anticipate the sounds of our actions is also important for more complex human behaviors such as speaking or playing music.

“When we learn to speak or to play music, we predict what sounds we’re going to hear — such as when we prepare to strike keys on a piano — and we compare this to what we actually hear,” explains Schneider. “We use mismatches between expectation and experience to change how we play — and we get better over time because our brain is trying to minimize these errors.”

An inability to make predictions like this is also thought to be involved in a range of disorders.

“Overactive prediction circuits in the brain are thought to lead to the voice-like hallucinations associated with schizophrenia while an inability to learn the consequences of one’s actions could lead to debilitating social paralysis, as in autism,” explains Schneider. “By figuring out how the brain normally makes predictions about self-generated sounds, we open the opportunity for understanding a fascinating ability — predicting the future — and for deepening our understanding of how the brain breaks during disease.”

Falling stars hold clue for understanding dying stars

Posted on



An international team of researchers has proposed a new method to investigate the inner workings of supernovae explosions. This new method uses meteorites and is unique in that it can determine the contribution from electron anti-neutrinos, enigmatic particles which can’t be tracked through other means.

Supernovae are important events in the evolution of stars and galaxies, but the details of how the explosions occur are still unknown. By measuring the amount of 98Ru (an isotope of ruthenium) in meteorites, it should be possible to estimate how much of its progenitor 98Tc (a short-lived isotope of technetium) was present in the material from which the Solar System formed. The amount of 98Tc in turn is sensitive to the characteristics, such as temperature, of electron anti-neutrinos in the supernova process, as well as to how much time passed between the supernova and the formation of the Solar System. The expected traces of 98Tc are only a little below the smallest currently detectable levels, raising hopes that they will be measured in the near future.
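The link between 98Tc made in the supernova and the amount still present when the Solar System formed is ordinary radioactive-decay bookkeeping. A small sketch follows, assuming a 98Tc half-life of roughly 4.2 million years and an illustrative delay; treat both numbers as assumptions here:

    # Decay bookkeeping for an extinct radionuclide; numbers are illustrative.
    import math

    half_life_myr = 4.2    # approximate 98Tc half-life, in millions of years
    delay_myr = 20.0       # assumed gap between the supernova and Solar System formation

    decay_constant = math.log(2) / half_life_myr
    surviving_fraction = math.exp(-decay_constant * delay_myr)
    print(f"Fraction of 98Tc surviving a {delay_myr:.0f} Myr gap: {surviving_fraction:.1e}")
    # The 98Tc that survives decays into 98Ru inside early Solar System solids,
    # so a measured 98Ru excess constrains the initial 98Tc once the delay is known.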

“There are six neutrino species. Previous studies have shown that neutrino-isotopes are predominantly produced by the five neutrino species other than the electron anti-neutrino. By finding a neutrino-isotope synthesized predominantly by the electron anti-neutrino, we can estimate the temperatures of all six neutrino species, which are important for understanding the supernova explosion mechanism.”

At the end of its life, a massive star dies in a fiery explosion known as a supernova. This explosion blasts most of the star’s mass out into space, where it is recycled into new stars and planets, leaving distinct chemical signatures that tell scientists about the supernova. Meteorites, sometimes called falling stars, formed from material left over from the birth of the Solar System and thus preserve those original chemical signatures.

Massive monumental cemetery built by Eastern Africa’s earliest herders discovered in Kenya

Posted on



An international team, including researchers at Stony Brook University and the Max Planck Institute for the Science of Human History, has found the earliest and largest monumental cemetery in eastern Africa. The Lothagam North Pillar Site was built 5,000 years ago by early pastoralists living around Lake Turkana, Kenya. This group is believed to have had an egalitarian society, without a stratified social hierarchy. Thus their construction of such a large public project contradicts long-standing narratives about early complex societies, which suggest that a stratified social structure is necessary to enable the construction of large public buildings or monuments.

The Lothagam North Pillar Site was a communal cemetery constructed and used over a period of several centuries, between about 5,000 and 4,300 years ago. Early herders built a platform approximately 30 meters in diameter and excavated a large cavity in the center to bury their dead. After the cavity was filled and capped with stones, the builders placed large megalithic pillars, some sourced from as much as a kilometer away, on top. Stone circles and cairns were added nearby. An estimated minimum of 580 individuals were densely buried within the central platform cavity of the site. Men, women, and children of different ages, from infants to the elderly, were buried in the same area, without any burials being singled out for special treatment. Additionally, essentially all individuals were buried with personal ornaments, and the distribution of ornaments was approximately equal throughout the cemetery. These factors indicate a relatively egalitarian society without strong social stratification.

Historically, archaeologists have theorized that people built permanent monuments as reminders of shared history, ideals, and culture only once they had established settled, socially stratified agricultural societies with abundant resources and strong leadership. It was believed that a political structure and the resources for specialization were prerequisites for engaging in monument building. Ancient monuments have thus previously been regarded as reliable indicators of complex societies with differentiated social classes. However, the Lothagam North cemetery was constructed by mobile pastoralists who show no evidence of a rigid social hierarchy. “This discovery challenges earlier ideas about monumentality,” explains Elizabeth Sawchuk of Stony Brook University and the Max Planck Institute for the Science of Human History. “Absent other evidence, Lothagam North provides an example of monumentality that is not demonstrably linked to the emergence of hierarchy, forcing us to consider other narratives of social change.”

The discovery is consistent with similar examples elsewhere in Africa and on other continents, where large monumental structures have been built by groups thought to be egalitarian in their social organization. This research has the potential to reshape global perspectives on how, and why, large groups of people come together to form complex societies. In this case, it appears that Lothagam North was built during a period of profound change. Pastoralism had just been introduced to the Turkana Basin, and newcomers arriving with sheep, goats, and cattle would have encountered diverse groups of fisher-hunter-gatherers already living around the lake. Newcomers and locals also faced a difficult environmental situation, as annual rainfall decreased during this period and Lake Turkana shrank by as much as fifty percent. Early herders may have constructed the cemetery as a place for people to come together to form and maintain social networks to cope with major economic and environmental change.

“The monuments may have served as a place for people to congregate, renew social ties, and reinforce community identity,” states Anneke Janzen also of the Max Planck Institute for the Science of Human History. “Information exchange and interaction through shared ritual may have helped mobile herders navigate a rapidly changing physical landscape.” After several centuries, pastoralism became entrenched and lake levels stabilized. It was around this time that the cemetery ceased to be used.

“The Lothagam North Pillar Site is the earliest known monumental site in eastern Africa, built by the region’s first herders,” states Hildebrand. “This finding makes us reconsider how we define social complexity, and the kinds of motives that lead groups of people to create public architecture.”

Congenital blindness reversed in mice

Posted on



Researchers funded by the National Eye Institute (NEI) have reversed congenital blindness in mice by changing supportive cells in the retina called Müller glia into rod photoreceptors. The findings advance efforts toward regenerative therapies for blinding diseases such as age-related macular degeneration and retinitis pigmentosa.

“This is the first report of scientists reprogramming Müller glia to become functional rod photoreceptors in the mammalian retina. Rods allow us to see in low light, but they may also help preserve cone photoreceptors, which are important for color vision and high visual acuity. Cones tend to die in later-stage eye diseases. If rods can be regenerated from inside the eye, this might be a strategy for treating diseases of the eye that affect photoreceptors.”

Photoreceptors are light-sensitive cells in the retina in the back of the eye that signal the brain when activated. In mammals, including mice and humans, photoreceptors fail to regenerate on their own. Like most neurons, once mature they don’t divide.

Scientists have long studied the regenerative potential of Müller glia because in other species, such as zebrafish, they divide in response to injury and can turn into photoreceptors and other retinal neurons. The zebrafish can thus regain vision after severe retinal injury. In the lab, however, scientists can coax mammalian Müller glia to behave more like they do in the fish. But it requires injuring the tissue.

“From a practical standpoint, if you’re trying to regenerate the retina to restore a person’s vision, it is counterproductive to injure it first to activate the Müller glia.”

“We wanted to see if we could program Müller glia to become rod photoreceptors in a living mouse without having to injure its retina.”

In the first phase of a two-stage reprogramming process, Chen’s team spurred Müller glia in normal mice to divide by injecting their eyes with a gene that turns on a protein called beta-catenin. Weeks later, they injected the mice’s eyes with factors that encouraged the newly divided cells to develop into rod photoreceptors.

The researchers used microscopy to visually track the newly formed cells. They found that the newly formed rod photoreceptors looked structurally no different from real photoreceptors. In addition, synaptic structures that allow the rods to communicate with other types of neurons within the retina had also formed. To determine whether the Müller glia-derived rod photoreceptors were functional, they tested the treatment in mice with congenital blindness, which meant that they were born without functional rod photoreceptors.

In the treated mice that were born blind, Müller glia-derived rods developed just as effectively as they had in normal mice. Functionally, they confirmed that the newly formed rods were communicating with other types of retinal neurons across synapses. Furthermore, light responses recorded from retinal ganglion cells — neurons that carry signals from photoreceptors to the brain — and measurements of brain activity confirmed that the newly-formed rods were in fact integrating in the visual pathway circuitry, from the retina to the primary visual cortex in the brain.

Chen’s lab is conducting behavioral studies to determine whether the mice have regained the ability to perform visual tasks such as a water maze task. Chen also plans to see if the technique works on cultured human retinal tissue.

Printable tags turn everyday objects into smart, connected devices

Posted on



Engineers have developed printable metal tags that could be attached to everyday objects and turn them into “smart” Internet of Things devices.

The metal tags are made from patterns of copper foil printed onto thin, flexible, paper-like substrates and are designed to reflect WiFi signals. The tags work essentially like “mirrors” that reflect radio signals from a WiFi router. When a user’s finger touches one of these mirrors, it disturbs the reflected WiFi signals in a way that can be remotely sensed by a WiFi receiver, such as a smartphone.
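On the receiver side, this sensing principle reduces to something quite simple: record a baseline of the tag’s reflected signal when nothing is touching it, then flag a touch when a new reading departs from that baseline by more than the usual noise. The sketch below is a generic illustration with synthetic numbers, not the LiveTag signal-processing pipeline:

    # Generic threshold detector for a touch disturbing a reflected WiFi signal.
    # Synthetic numbers; not the LiveTag pipeline.
    import statistics

    untouched_readings = [0.82, 0.80, 0.81, 0.83, 0.79, 0.82]  # hypothetical reflection amplitudes
    baseline = statistics.mean(untouched_readings)
    threshold = 5 * statistics.stdev(untouched_readings)

    def touched(reading):
        """Flag a touch when the reflection deviates strongly from the untouched baseline."""
        return abs(reading - baseline) > threshold

    print(touched(0.81))  # False: within normal fluctuation
    print(touched(0.55))  # True: a finger on the tag damps the reflection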

The tags can be tacked onto plain objects that people touch and interact with every day, like water bottles, walls or doors. These plain objects then essentially become smart, connected devices that can signal a WiFi device whenever a user interacts with them. The tags can also be fashioned into thin keypads or smart home control panels that can be used to remotely operate WiFi-connected speakers, smart lights and other Internet of Things appliances.

“Our vision is to expand the Internet of Things to go beyond just connecting smartphones, smartwatches and other high-end devices,” said senior author Xinyu Zhang, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering and member of the Center for Wireless Communications at UC San Diego. “We’re developing low-cost, battery-free, chipless, printable sensors that can include everyday objects as part of the Internet of Things.”

Zhang’s team named the technology “LiveTag.” These metal tags are designed to reflect only specific signals within the WiFi frequency range. By changing the type of material they’re made of and the pattern in which they’re printed, the researchers can redesign the tags to reflect Bluetooth, LTE or cellular signals instead.

The tags have no batteries, silicon chips, or any discrete electronic components, so they require hardly any maintenance — no batteries to change, no circuits to fix.

The team presented their work at the recent USENIX Symposium on Networked Systems Design and Implementation Conference.

Smart tagging

As a proof of concept, the researchers used LiveTag to create a paper-thin music player controller complete with a play/pause button, next track button and sliding bar for tuning volume. The buttons and sliding bar each consist of at least one metal tag so touching any of them sends signals to a WiFi device. The researchers have so far only tested the LiveTag music player controller to remotely trigger a WiFi receiver, but they envision that it would be able to remotely control WiFi-connected music players or speakers when attached to a wall, couch armrest, clothes, or other ordinary surface.

The researchers also adapted LiveTag as a hydration monitor. They attached it to a plastic water bottle and showed that it could be used to track a user’s water intake by monitoring the water level in the bottle. The water inside affects the tag’s response in the same way a finger touch would — as long as the bottle is not made of metal, which would block the signal. The tag has multiple resonators that each get detuned at a specific water level. The researchers imagine that the tag could be used to deliver reminders to a user’s smartphone to prevent dehydration.
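Turning “which resonators are detuned” into a water level is then a small lookup. The hypothetical sketch below illustrates this; the number of resonators and the level each one corresponds to are assumptions for illustration:

    # Hypothetical mapping from detuned resonators to a water level.
    # Resonator count and level steps are illustrative assumptions.
    LEVELS_ML = [100, 200, 300, 400, 500]  # water level at which each resonator detunes

    def estimate_water_level_ml(detuned):
        """detuned: one boolean per resonator, True if that resonator reads as detuned."""
        covered = [level for level, is_detuned in zip(LEVELS_ML, detuned) if is_detuned]
        return max(covered) if covered else 0

    print(estimate_water_level_ml([True, True, False, False, False]))  # ~200 ml left in the bottle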

Future applications

On a broader scope, Zhang envisions using LiveTag technology to track human interaction with everyday objects. For example, LiveTag could potentially be used as an inexpensive way to assess the recovery of patients who have suffered from stroke.

“When patients return home, they could use this technology to provide data on their motor activity based on how they interact with everyday objects at home — whether they are opening or closing doors in a normal way, or if they are able to pick up bottles of water, for example. The amount, intensity and frequency of their activities could be logged and sent to their doctors to evaluate their recovery,” said Zhang. “And this can all be done in the comfort of their own homes rather than having to keep going back to the clinic for frequent motor activity testing,” he added.

Another example is tagging products at retail stores and assessing customer interest based on which products they touch. Rather than use cameras, stores could use LiveTag as an alternative that offers customers more privacy.

Next steps

The researchers note several limitations of the technology. LiveTag currently cannot work with a WiFi receiver further than one meter (three feet) away, so researchers are working on improving the tag sensitivity and detection range.

Laziness helped lead to extinction of Homo erectus

Posted on



New archaeological research from The Australian National University (ANU) has found that Homo erectus, an extinct species of primitive humans, went extinct in part because they were ‘lazy’.

An archaeological excavation of ancient human occupation sites in the Arabian Peninsula dating to the Early Stone Age found that Homo erectus used ‘least-effort strategies’ for tool making and collecting resources.

This ‘laziness’ paired with an inability to adapt to a changing climate likely played a role in the species going extinct.

“They really don’t seem to have been pushing themselves.”

“I don’t get the sense they were explorers looking over the horizon. They didn’t have that same sense of wonder that we have.”

Dr Shipton said this was evident in the way the species made their stone tools and collected resources.

“To make their stone tools they would use whatever rocks they could find lying around their camp, which were mostly of lower quality than those used by later stone tool makers.”

“At the site we looked at there was a big rocky outcrop of quality stone just a short distance away up a small hill.

“But rather than walk up the hill they would just use whatever bits had rolled down and were lying at the bottom.

“When we looked at the rocky outcrop there were no signs of any activity, no artefacts and no quarrying of the stone.

“They knew it was there, but because they had enough adequate resources they seem to have thought, ‘why bother?’.”

This is in contrast to the stone tool makers of later periods, including early Homo sapiens and Neanderthals, who were climbing mountains to find good quality stone and transporting it over long distances.

Dr Shipton said a failure to progress technologically, as their environment dried out into a desert, also contributed to the population’s demise.

“Not only were they lazy, but they were also very conservative,” Dr Shipton said.

“The sediment samples showed the environment around them was changing, but they were doing the exact same things with their tools.

“There was no progression at all, and their tools are never very far from these now dry river beds. I think in the end the environment just got too dry for them.”

The excavation and survey work was undertaken in 2014 at the site of Saffaqah near Dawadmi in central Saudi Arabia.