A team of scientists has uncovered the neural processes mice use to ignore their own footsteps, a discovery that offers new insights into how we learn to speak and play music.
“The ability to ignore one’s own footsteps requires the brain to store and recall memories and to make some pretty stellar computations,” explains David Schneider, an assistant professor at New York University’s Center for Neural Science and one of the paper’s lead authors. “These are the building blocks for other, more important sound-generating behaviors, like recognizing the sounds you make when learning how to speak or to play a musical instrument.”
The research centered on an intuition — that we are usually unaware of the sound of our own footsteps — as a vehicle for understanding larger neural phenomena: this everyday behavior reveals the brain’s ability to monitor, recognize, and remember the sounds of one’s own movements in relation to those of the larger environment.
“The capacity to anticipate and discriminate these movement-related sounds from environmental sounds is critical to normal hearing,” Schneider explains. “But how the brain learns to anticipate the sounds resulting from our movements remains largely unknown.”
To explore this, Schneider and his colleagues designed an “acoustic virtual reality system” for the mice. Here, the scientists controlled the sounds the mice made walking on a treadmill while monitoring the animals’ neural activity, allowing them to identify the neural circuit mechanisms that learn to suppress movement-related sounds.
Overall, they found a flexibility in neural function — the mice developed an adjustable “sensory filter” that allowed them to ignore the sounds of their own footsteps. In turn, this allowed them to better detect other sounds arising from their surroundings.
“For mice, this is really important,” said Schneider. “They are prey animals, so they really need to be able to listen for a cat creeping up on them, even when they’re walking and making noise.”
Being able to ignore the sounds of one’s own movements is likely important for humans as well. But the ability to anticipate the sounds of our actions is also important for more complex human behaviors such as speaking or playing music.
“When we learn to speak or to play music, we predict what sounds we’re going to hear — such as when we prepare to strike keys on a piano — and we compare this to what we actually hear,” explains Schneider. “We use mismatches between expectation and experience to change how we play — and we get better over time because our brain is trying to minimize these errors.”
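The predict-compare-adjust loop Schneider describes can be sketched as a minimal error-minimization routine. The target loudness, learning rate, and linear sound model below are illustrative assumptions, not parameters from the study.

```python
# A minimal sketch of the predict-compare-adjust loop described above.
# Target, learning rate, and the linear sound model are all illustrative.

def learn_motor_command(target=1.0, rate=0.3, steps=50):
    """Iteratively shrink the mismatch between predicted and heard sound."""
    command = 0.0  # initial motor command, e.g. force applied to a piano key
    for _ in range(steps):
        heard = command          # toy model: the sound scales with the command
        error = target - heard   # mismatch between expectation and experience
        command += rate * error  # adjust the command to reduce the error
    return command

final_command = learn_motor_command()
```

Each pass shrinks the mismatch between the expected and heard sound, so the command converges on the one that produces the target — the essence of error-driven motor learning.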
Being unable to make predictions like this is also thought to be involved in a spectrum of afflictions.
“Overactive prediction circuits in the brain are thought to lead to the voice-like hallucinations associated with schizophrenia while an inability to learn the consequences of one’s actions could lead to debilitating social paralysis, as in autism,” explains Schneider. “By figuring out how the brain normally makes predictions about self-generated sounds, we open the opportunity for understanding a fascinating ability — predicting the future — and for deepening our understanding of how the brain breaks during disease.”
An international team of researchers has proposed a new method to investigate the inner workings of supernovae explosions. This new method uses meteorites and is unique in that it can determine the contribution from electron anti-neutrinos, enigmatic particles which can’t be tracked through other means.
Supernovae are important events in the evolution of stars and galaxies, but the details of how the explosions occur are still unknown. By measuring the amount of 98Ru (an isotope of Ruthenium) in meteorites, it should be possible to estimate how much of its progenitor 98Tc (a short-lived isotope of Technetium) was present in the material from which the Solar System formed. The amount of 98Tc in turn is sensitive to the characteristics, such as temperature, of electron anti-neutrinos in the supernova process; as well as to how much time passed between the supernova and the formation of the Solar System. The expected traces of 98Tc are only a little below the smallest currently detectable levels, raising hopes that they will be measured in the near future.
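The inference chain above rests on simple radioactive decay: the longer the gap between the supernova and the Solar System’s formation, the less 98Tc survives to leave its 98Ru signature. A back-of-envelope sketch, where the ~4.2-million-year half-life is the commonly quoted value for 98Tc and the elapsed time is an illustrative guess:

```python
import math

# Fraction of a short-lived isotope surviving from a supernova until the
# Solar System forms. Half-life ~4.2 Myr (98Tc); elapsed time illustrative.

def surviving_fraction(elapsed_myr, half_life_myr=4.2):
    """Fraction of a radioactive isotope remaining after elapsed_myr."""
    return math.exp(-math.log(2) * elapsed_myr / half_life_myr)

# If roughly 10 million years passed before the Solar System formed:
frac_remaining = surviving_fraction(10.0)   # about 0.19
```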
“There are six neutrino species. Previous studies have shown that neutrino-isotopes are predominantly produced by the five neutrino species other than the electron anti-neutrino. By finding a neutrino-isotope synthesized predominantly by the electron anti-neutrino, we can estimate the temperatures of all six neutrino species, which are important for understanding the supernova explosion mechanism.”
At the end of its life, a massive star dies in a fiery explosion known as a supernova. This explosion blasts most of the mass in the star out into outer space. That mass is then recycled into new stars and planets, leaving distinct chemical signatures which tell scientists about the supernova. Meteorites formed from material left over from the birth of the Solar System, thus preserving its original chemical signatures.
A rechargeable battery technology developed at the University of Michigan could double the output of today’s lithium ion cells — drastically extending electric vehicle ranges and time between cell phone charges — without taking up any added space.
By using a ceramic, solid-state electrolyte, engineers can harness the power of lithium metal batteries without the historic issues of poor durability and short-circuiting. The result is a roadmap to what could be the next generation of rechargeable batteries.
In the 1980s, rechargeable lithium metal batteries that used liquid electrolytes were considered the next big thing, penetrating the market in early portable phones. But their propensity to combust when charged led engineers in different directions. The lithium atoms that shuttle between the electrodes tended to build tree-like filaments called dendrites on the electrode surfaces, eventually shorting the battery and igniting the flammable electrolyte.
The lithium ion battery — a more stable, but less energy-dense technology — was introduced in 1991 and quickly became the new standard. These batteries replaced lithium metal with graphite anodes, which absorb the lithium and prevent dendrites from forming, but also come with performance costs:
Graphite can hold only one lithium ion for every six carbon atoms, giving it a specific capacity of approximately 350 milliampere hours per gram (mAh/g). The lithium metal in a solid state battery has a specific capacity of 3,800 mAh/g.
Current lithium ion batteries max out with a total energy density around 600 watt-hours per liter (Wh/L) at the cell level. In principle, solid-state batteries can reach 1,200 Wh/L.
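Those capacity figures follow from Faraday’s law, with one electron transferred per lithium atom stored. A quick check, assuming the usual LiC6 stoichiometry for graphite with capacity quoted per gram of carbon:

```python
# Theoretical specific capacities from Faraday's law (one electron per Li).

F = 96485.0                 # Faraday constant, C/mol
M_C, M_LI = 12.011, 6.94    # molar masses of carbon and lithium, g/mol

# 1 mAh = 3.6 C, so specific capacity in mAh/g is F / (3.6 * g per mole of Li)
graphite_mah_g = F / (3.6 * 6 * M_C)   # one Li per six C atoms -> ~372 mAh/g
li_metal_mah_g = F / (3.6 * M_LI)      # pure lithium metal     -> ~3862 mAh/g
ratio = li_metal_mah_g / graphite_mah_g  # roughly a factor of ten
```

The theoretical ~372 and ~3,862 mAh/g bracket the article’s quoted ~350 and 3,800 mAh/g, and the factor-of-ten ratio is what drives the projected jump in energy density.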
To solve lithium metal’s combustion problem, U-M engineers created a ceramic layer that stabilizes the surface — keeping dendrites from forming and preventing fires. It allows batteries to harness the benefits of lithium metal — energy density and high-conductivity — without the dangers of fires or degradation over time.
“What we’ve come up with is a different approach — physically stabilizing the lithium metal surface with a ceramic,” Sakamoto said. “It’s not combustible. We make it at over 1,800 degrees Fahrenheit in air. And there’s no liquid, which is what typically fuels the battery fires you see.
“You get rid of that fuel, you get rid of the combustion.”
In earlier solid state electrolyte tests, lithium metal grew through the ceramic electrolyte at low charging rates, causing a short circuit, much like that in liquid cells. U-M researchers solved this problem with chemical and mechanical treatments that provide a pristine surface for lithium to plate evenly, effectively suppressing the formation of dendrites or filaments. Not only does this improve safety, it enables a dramatic improvement in charging rates, Sakamoto said.
“Up until now, the rates at which you could plate lithium would mean you’d have to charge a lithium metal car battery over 20 to 50 hours (for full power),” Sakamoto said. “With this breakthrough, we demonstrated we can charge the battery in 3 hours or less.
“We’re talking a factor of 10 increase in charging speed compared to previous reports for solid state lithium metal batteries. We’re now on par with lithium ion cells in terms of charging rates, but with additional benefits.”
That charge/discharge process is what ultimately kills a lithium ion battery: repeatedly exchanging ions between the cathode and anode produces visible degradation almost from the start.
In testing the ceramic electrolyte, however, the team observed no visible degradation after long-term cycling, said Nathan Taylor, a U-M post-doctoral fellow in mechanical engineering.
“We did the same test for 22 days,” he said. “The battery was just the same at the start as it was at the end. We didn’t see any degradation. We aren’t aware of any other bulk solid state electrolyte performing this well for this long.”
Bulk solid state electrolytes enable cells that are a drop-in replacement for current lithium ion batteries and could leverage existing battery manufacturing technology. With the material performance verified, the research group has begun producing thin solid electrolyte layers required to meet solid state capacity targets.
Engineers have developed printable metal tags that could be attached to everyday objects and turn them into “smart” Internet of Things devices.
The metal tags are made from patterns of copper foil printed onto thin, flexible, paper-like substrates and are made to reflect WiFi signals. The tags work essentially like “mirrors” that reflect radio signals from a WiFi router. When a user’s finger touches these mirrors, it disturbs the reflected WiFi signals in a way that can be remotely sensed by a WiFi receiver, like a smartphone.
The tags can be tacked onto plain objects that people touch and interact with every day, like water bottles, walls or doors. These plain objects then essentially become smart, connected devices that can signal a WiFi device whenever a user interacts with them. The tags can also be fashioned into thin keypads or smart home control panels that can be used to remotely operate WiFi-connected speakers, smart lights and other Internet of Things appliances.
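At the receiver, touch sensing of this kind reduces to spotting a deviation from the tag’s untouched baseline reflection. The sketch below is a hypothetical simplification with made-up readings; the actual LiveTag system analyzes WiFi channel state rather than a simple signal-strength cutoff.

```python
# Hypothetical sketch: a finger on the tag perturbs the reflection, pushing
# the received signal away from its untouched baseline. Values are made up.

def detect_touches(readings, baseline, threshold_db=3.0):
    """Return sample indices where the signal deviates from baseline."""
    return [i for i, r in enumerate(readings)
            if abs(r - baseline) > threshold_db]

samples = [-40.1, -40.3, -46.8, -47.2, -40.0]      # dBm; the dip marks a touch
touched = detect_touches(samples, baseline=-40.0)  # -> [2, 3]
```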
“Our vision is to expand the Internet of Things to go beyond just connecting smartphones, smartwatches and other high-end devices,” said senior author Xinyu Zhang, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering and member of the Center for Wireless Communications at UC San Diego. “We’re developing low-cost, battery-free, chipless, printable sensors that can include everyday objects as part of the Internet of Things.”
Zhang’s team named the technology “LiveTag.” These metal tags are designed to reflect only specific signals within the WiFi frequency range. By changing the type of material they’re made of and the pattern in which they’re printed, the researchers can redesign the tags to reflect Bluetooth, LTE or cellular signals instead.
The tags have no batteries, silicon chips, or any discrete electronic components, so they require hardly any maintenance — no batteries to change, no circuits to fix.
The team presented their work at the recent USENIX Symposium on Networked Systems Design and Implementation (NSDI).
As a proof of concept, the researchers used LiveTag to create a paper-thin music player controller complete with a play/pause button, next track button and sliding bar for tuning volume. The buttons and sliding bar each consist of at least one metal tag so touching any of them sends signals to a WiFi device. The researchers have so far only tested the LiveTag music player controller to remotely trigger a WiFi receiver, but they envision that it would be able to remotely control WiFi-connected music players or speakers when attached to a wall, couch armrest, clothes, or other ordinary surface.
The researchers also adapted LiveTag as a hydration monitor. They attached it to a plastic water bottle and showed that it could be used to track a user’s water intake by monitoring the water level in the bottle. The water inside affects the tag’s response in the same way a finger touch would — as long as the bottle is not made of metal, which would block the signal. The tag has multiple resonators that each get detuned at a specific water level. The researchers imagine that the tag could be used to deliver reminders to a user’s smartphone to prevent dehydration.
On a broader scope, Zhang envisions using LiveTag technology to track human interaction with everyday objects. For example, LiveTag could potentially be used as an inexpensive way to assess the recovery of patients who have suffered from stroke.
“When patients return home, they could use this technology to provide data on their motor activity based on how they interact with everyday objects at home — whether they are opening or closing doors in a normal way, or if they are able to pick up bottles of water, for example. The amount, intensity and frequency of their activities could be logged and sent to their doctors to evaluate their recovery,” said Zhang. “And this can all be done in the comfort of their own homes rather than having to keep going back to the clinic for frequent motor activity testing,” he added.
Another example is tagging products at retail stores and assessing customer interest based on which products they touch. Rather than use cameras, stores could use LiveTag as an alternative that offers customers more privacy.
The researchers note several limitations of the technology. LiveTag currently cannot work with a WiFi receiver more than one meter (three feet) away, so the researchers are working on improving the tag’s sensitivity and detection range.
For the first time, researchers were able to study quantum interference in a three-level quantum system and thereby control the behavior of individual electron spins. To this end, they used a novel nanostructure in which a quantum system is integrated into a nanoscale mechanical oscillator in the form of a diamond cantilever. Nature Physics has published the study, which was conducted at the University of Basel and the Swiss Nanoscience Institute.
The electronic spin is a fundamental quantum mechanical property intrinsic to every electron. In the quantum world, the spin describes the electron’s direction of rotation around its axis and can normally occupy two so-called eigenstates, commonly denoted “up” and “down.” The quantum properties of such spins offer interesting perspectives for future technologies, for example in the form of extremely precise quantum sensors.
Combining spins with mechanical oscillators
Researchers led by Professor Patrick Maletinsky and PhD candidate Arne Barfuss from the Swiss Nanoscience Institute at the University of Basel report in Nature Physics a new method to control the spins’ quantum behavior through a mechanical system.
For their experimental study, they combined such a quantum system with a mechanical oscillator. More specifically, the researchers employed electrons trapped in so-called nitrogen-vacancy centers and embedded these spins in single-crystalline mechanical resonators made from diamond.
These nitrogen-vacancy spins are special in that they possess not two but three eigenstates, which can be described as “up,” “down” and “zero.” Using the special coupling of a mechanical oscillator to the spin, the researchers demonstrated for the first time complete quantum control over such a three-level system.
Controlling three quantum states
In particular, the oscillator allowed them to address all three possible transitions in the spin and to study how the resulting excitation pathways interfere with each other.
This scenario, known as “closed-contour driving,” had not been investigated before, but it opens interesting fundamental and practical perspectives. For example, the experiment allowed for a breaking of time-reversal symmetry, meaning that the system’s properties look fundamentally different when the direction of time is reversed. In this scenario, the phase of the mechanical oscillator determined whether the spin circled “clockwise” (direction of rotation up, down, zero, up) or “counter-clockwise.”
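The chirality described here can be illustrated with a toy three-level loop Hamiltonian: with zero loop phase the two upper states fill symmetrically, while a nonzero phase makes population circulate one way around the loop. Couplings, phase, and evolution time below are illustrative, not the experimental parameters.

```python
import numpy as np

# Toy model of closed-contour driving: three levels coupled pairwise, with a
# phase phi on one link of the loop. All parameters are illustrative.

def populations(phi, t=0.6, omega=1.0):
    """Evolve the state |0> under the three-level loop; return populations."""
    H = omega * np.array([
        [0.0, 1.0, np.exp(1j * phi)],
        [1.0, 0.0, 1.0],
        [np.exp(-1j * phi), 1.0, 0.0],
    ])
    vals, vecs = np.linalg.eigh(H)  # H is Hermitian
    U = vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T
    psi = U @ np.array([1.0, 0.0, 0.0])
    return np.abs(psi) ** 2

p_symmetric = populations(0.0)     # no loop phase: states 1 and 2 fill equally
p_chiral = populations(np.pi / 2)  # loop phase: population circulates one way
```

Flipping the sign of `phi` swaps the roles of the two upper states — the circulation reverses, which is the time-reversal asymmetry described above.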
This abstract concept has practical consequences for the fragile quantum states. Similar to the well-known Schrödinger’s cat, spins can be simultaneously in a superposition of two or three of the available eigenstates for a certain period, the so-called quantum coherence time.
If the three eigenstates are coupled to each other using the closed contour driving discovered here, the coherence time can be significantly extended, as the researchers were able to show. Compared to systems where only two of the three possible transitions are driven, coherence increased almost a hundredfold.
Such coherence protection is a key element for future quantum technologies and another main result of this work.
Applications for sensor technology
The work described here holds high potential for future applications. It is conceivable that the hybrid resonator-spin system could be used for the precise measurement of time-dependent signals with frequencies in the gigahertz range — for example in quantum sensing or quantum information processing. For time-dependent signals emerging from nanoscale objects, such tasks are currently very difficult to address by other means. Here the combination of a spin and an oscillating system could prove helpful, particularly because of the demonstrated protection of spin coherence.
A new way of arranging advanced computer components called memristors on a chip could enable them to be used for general computing, which could cut energy consumption by a factor of 100.
This would improve performance in low power environments such as smartphones or make for more efficient supercomputers.
“Historically, the semiconductor industry has improved performance by making devices faster. But although the processors and memories are very fast, they can’t be efficient because they have to wait for data to come in and out,” said Wei Lu, U-M professor of electrical and computer engineering and co-founder of memristor startup Crossbar Inc.
Memristors might be the answer. Named as a portmanteau of memory and resistor, they can be programmed to have different resistance states — meaning they store information as resistance levels. These circuit elements enable memory and processing in the same device, cutting out the data transfer bottleneck experienced by conventional computers in which the memory is separate from the processor.
However, unlike ordinary bits, which are 1 or 0, memristors can have resistances that are on a continuum. Some applications, such as computing that mimics the brain (neuromorphic), take advantage of the analog nature of memristors. But for ordinary computing, trying to differentiate among small variations in the current passing through a memristor device is not precise enough for numerical calculations.
Lu and his colleagues got around this problem by digitizing the current outputs — defining current ranges as specific bit values (i.e., 0 or 1). The team was also able to map large mathematical problems into smaller blocks within the array, improving the efficiency and flexibility of the system.
Computers with these new blocks, which the researchers call “memory-processing units,” could be particularly useful for implementing machine learning and artificial intelligence algorithms. They are also well suited to tasks that are based on matrix operations, such as simulations used for weather prediction. The simplest mathematical matrices, akin to tables with rows and columns of numbers, can map directly onto the grid of memristors.
Once the memristors are set to represent the numbers, operations that multiply and sum the rows and columns can be taken care of simultaneously, with a set of voltage pulses along the rows. The current measured at the end of each column contains the answers. A typical processor, in contrast, would have to read the value from each cell of the matrix, perform multiplication, and then sum up each column in series.
“We get the multiplication and addition in one step. It’s taken care of through physical laws. We don’t need to manually multiply and sum in a processor,” Lu said.
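A software analogue of the crossbar operation Lu describes, with the digitization step mentioned earlier folded in. The conductances, voltages, and the single 0/1 threshold are illustrative values.

```python
import numpy as np

# Crossbar sketch: conductances encode matrix entries; row voltages and
# column-current summation perform the matrix-vector product physically.

G = np.array([[1.0, 2.0],
              [3.0, 4.0]])    # memristor conductances encoding a 2x2 matrix
V = np.array([0.5, 1.0])      # voltage pulses applied along the two rows

# Each column wire sums its row contributions (Kirchhoff), and each memristor
# contributes G * V (Ohm), so the column currents are a matrix-vector product:
I = G.T @ V                   # -> [3.5, 5.0], multiply and add in one step

# Digitizing the analog outputs: define current ranges as bit values.
threshold = 4.0               # illustrative cutoff between "0" and "1" ranges
bits = (I > threshold).astype(int)   # -> [0, 1]
```

The matrix-vector product happens “for free” in the physics; the digitization step is what turns the analog column currents into ordinary bits usable for numerical work.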
His team chose to solve partial differential equations as a test for a 32×32 memristor array — which Lu imagines as just one block of a future system. These equations, including those behind weather forecasting, underpin many problems in science and engineering but are very challenging to solve. The difficulty comes from the complicated forms and multiple variables needed to model physical phenomena.
When solving partial differential equations exactly is impossible, solving them approximately can require supercomputers. These problems often involve very large matrices of data, so the memory-processor communication bottleneck is neatly solved with a memristor array.
Many marine protected areas are unnecessarily expensive and located in the wrong places.
The University of Queensland was part of research which found that protected areas miss many unique ecosystems and have a greater impact on fisheries than necessary.
A collaboration with the University of Hamburg, Wildlife Conservation Society and The Nature Conservancy assessed the efficiency of marine protected areas, which now cover 16 per cent of national waters around the world.
UQ’s School of Biological Sciences researcher Professor Hugh Possingham said international marine preservation targets are falling short.
“International conservation targets such as the United Nation’s Sustainable Development Goals call for protection of at least 10 per cent of all the world’s oceans and all marine ecosystems.”
“Despite a tenfold increase in marine protected areas since the year 2000 — a growth of 21 million square kilometres — half of all marine ecosystems still fall short of the target, with 10 ecosystems entirely unprotected.”
The researchers assessed whether the expansion had been cost efficient — measuring potential earnings lost from fisheries — and effectively focused.
The University of Hamburg’s Dr Kerstin Jantke, who led the research, said that marine protected areas could have been far more efficient with greater planning.
“With a more strategic approach at the inception of global conservation targets in 1982, the marine protected area network could be a third smaller, cost half as much, and meet international targets by protecting 10 per cent of every ecosystem,” she said.
“It is clearly in the interests of nations to start strategic planning as early as possible to avoid costly imbalanced reserve systems.”
Nations will negotiate new conservation targets for 2020-2030 at a United Nations meeting next year in China.
“We urge governments to take note and be tactical from the outset, delivering better outcomes for nature conservation, but also saving them a lot of money.”
Researchers who’ve analyzed ancient mitochondrial (mt)DNA isolated from a 22,000-year-old panda found in Cizhutuo Cave in the Guangxi Province of China — a place where no pandas live today — have revealed a new lineage of giant panda. The analysis shows that the ancient panda separated from present-day pandas 144,000 to 227,000 years ago, suggesting that it belonged to a distinct lineage not found in pandas today.
The newly sequenced mitochondrial genome represents the oldest DNA evidence from pandas.
“Using a single complete mtDNA sequence, we find a distinct mitochondrial lineage, suggesting that the Cizhutuo panda, while genetically more closely related to present-day pandas than other bears, has a deep, separate history from the common ancestor of present-day pandas,” says Qiaomei Fu from the Chinese Academy of Sciences. “This really highlights that we need to sequence more DNA from ancient pandas to really capture how their genetic diversity has changed through time and how that relates to their current, much more restricted and fragmented habitat.”
Very little has been known about pandas’ past, especially in regions outside of their current range in Shaanxi province or Gansu and Sichuan provinces. Evidence suggests that pandas in the past were much more widespread, but it’s been unclear how those pandas were related to pandas of today.
In the new study, the researchers used sophisticated methods to fish mitochondrial DNA from the ancient cave specimen. That’s a particular challenge because the specimen comes from a subtropical environment, which makes preservation and recovery of DNA difficult.
The researchers successfully sequenced nearly 150,000 DNA fragments and aligned them to the giant panda mitochondrial genome reference sequence to recover the Cizhutuo panda’s complete mitochondrial genome. They then used the new genome along with mitochondrial genomes from 138 present-day bears and 32 ancient bears to construct a family tree.
Their analysis shows that the split between the Cizhutuo panda and the ancestor of present-day pandas goes back about 183,000 years. The Cizhutuo panda also possesses 18 mutations that would alter the structure of proteins across six mitochondrial genes. The researchers say those amino acid changes may be related to the ancient panda’s distinct habitat in Guangxi or perhaps climate differences during the Last Glacial Maximum.
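As a rough illustration of how a split time can be read off sequence data, a simple molecular-clock estimate divides the observed per-site divergence by twice the substitution rate. The divergence and rate below are invented numbers for illustration, not the study’s calibrated values.

```python
# Molecular-clock sketch: time since two lineages split equals the per-site
# divergence divided by twice the substitution rate. Inputs are illustrative.

def split_time_years(per_site_divergence, subs_per_site_per_year):
    """Molecular-clock estimate of the time since two lineages diverged."""
    return per_site_divergence / (2.0 * subs_per_site_per_year)

# e.g. 0.5% divergence at 1.5e-8 substitutions per site per year:
t_split = split_time_years(0.005, 1.5e-8)   # on the order of 170,000 years
```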
The findings suggest that the ancient panda’s maternal lineage had a long and unique history that differed from the maternal lineages leading to present-day panda populations. The researchers say that their success in capturing the mitochondrial genome also suggests that they might successfully isolate and analyze DNA from the ancient specimen’s much more expansive nuclear genome.
“Comparing the Cizhutuo panda’s nuclear DNA to present-day genome-wide data would allow a more thorough analysis of the evolutionary history of the Cizhutuo specimen, as well as its shared history with present-day pandas.”
In even the most fuel-efficient cars, about 60 percent of the total energy of gasoline is lost through heat in the exhaust pipe and radiator. To combat this, researchers are developing new thermoelectric materials that can convert heat into electricity. These semiconducting materials could recirculate electricity back into the vehicle and improve fuel efficiency by up to 5 percent.
The challenge is that current thermoelectric materials for waste heat recovery are very expensive and time-consuming to develop. One of the state-of-the-art materials, made from a combination of hafnium and zirconium (elements most commonly used in nuclear reactors), took 15 years from initial discovery to optimized performance.
Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed an algorithm that can discover and optimize these materials in a matter of months, relying on solving quantum mechanical equations, without any experimental input.
“Semiconducting materials need to have very specific properties to work in this system, including high electrical conductivity, high thermopower, and low thermal conductivity, so that all that heat gets converted into electricity. Our goal was to find a new material that satisfies all the important properties for thermoelectric conversion while at the same time being stable and cheap,” said Kozinsky.
Kozinsky co-authored the research with Georgy Samsonidze, a research engineer at the Robert Bosch Research and Technology Center in Cambridge, MA, where both authors conducted most of the research.
In order to find such a material, the team developed an algorithm that can predict the electronic transport properties of a material based only on the chemical elements in its crystal structure. The key was to simplify the computational approach for electron-phonon scattering, speeding it up by about 10,000 times compared to existing algorithms.
Using the improved algorithm, the researchers screened many possible crystal structures, including structures that had never been synthesized before. From those, Kozinsky and Samsonidze whittled the list down to several interesting candidates. Of those candidates, the researchers did further computational optimization and sent the top performers to the experimental team.
Experimentalists synthesized the top candidates suggested by these computations and found a material that was as efficient and as stable as previous thermoelectric materials, but 10 times cheaper. The total time from initial screening to working devices: 15 months.
“We did in 15 months of computation and experimentation what took 15 years for previous materials to be optimized,” said Kozinsky. “What’s really exciting is that we’re probably not fully understanding the extent of the simplification yet. We could potentially make this method even faster and cheaper.”
Kozinsky said he hopes to improve the new methodology and use it to explore electronic transport in a wider class of new exotic materials such as topological insulators.
When people are out and about, they leave plumes of chemicals behind them — from both car tailpipes and the products they put on their skin and hair. In fact, emissions of siloxane, a common ingredient in shampoos, lotions, and deodorants, are comparable in magnitude to the emissions of major components of vehicle exhaust, such as benzene, from rush-hour traffic in Boulder, Colorado, according to a new CIRES and NOAA study.
This work is in line with other recent findings that chemical emissions from personal care products can contribute significantly to urban air pollution.
“We detected a pattern of emissions that coincides with human activity: people apply these products in the morning, leave their homes, and drive to work or school. So emissions spike during commuting hours,” said lead author Matthew Coggon, a CIRES scientist at the University of Colorado Boulder working in the NOAA Earth System Research Laboratory.
D5 Siloxane, short for decamethylcyclopentasiloxane, is added to personal care products like shampoos and lotions to give them a smooth, silky feeling. Siloxane belongs to a class of chemicals called volatile organic compounds (VOCs); once applied, they evaporate quickly. In the air, sunlight can trigger those VOCs to react with nitrogen oxides and other compounds to form ozone and particulate matter — two types of pollution that are regulated because of their effects on air quality and human health.
Coggon and his colleagues measured VOCs from the roof of NOAA’s Earth System Research Laboratory in December 2015 and January 2017, and from a mobile laboratory driving around Boulder in February 2016. Among other measurements, they tracked the concentrations of traffic-related compounds, including benzene, commonly used as a marker of vehicle exhaust, during rush hour.
“We were surveying the air, monitoring every species our instrument was sensitive to — about 150 compounds,” said Coggon. From that soup of chemicals, one compound caught their attention. “We found a big peak in the data but we didn’t know what it was,” he said.
A colleague suggested siloxane, and that guess turned out to be correct. Because the siloxane emissions correlated with the benzene emissions from traffic, Coggon’s team initially figured siloxane was also a chemical in vehicle exhaust, so they tested tailpipe emissions directly and took roadside measurements. They couldn’t find it.
Since siloxane and benzene weren’t coming from the same source, Coggon and his colleagues realized that they had nevertheless linked both chemicals to a particular human behavior: Commuting.
By studying their data hour-by-hour, they realized siloxane emissions peaked in the morning, when people put on personal care products and went outside into their cars or buses. That’s when benzene emissions went up too. Emissions of both chemicals decreased during the day, then peaked again during the evening commute. The evening peak of siloxane emissions was lower than in the morning, since the personal care products had largely evaporated throughout the day. “That daily pattern of emissions is what’s key,” Coggon said. “It resembles people’s activities.”
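The reasoning here is a correlation-of-time-series argument: two hourly emission profiles that both track commuting should correlate strongly even though their sources differ. A sketch with synthetic hourly data:

```python
# Correlating two synthetic 24-hour emission profiles that both peak at
# commute hours. The values are made up for illustration.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

siloxane = [1,1,1,1,1,2,5,9,7,4,3,3,3,3,3,3,4,6,5,3,2,1,1,1]  # taller AM peak
benzene  = [1,1,1,1,1,2,4,8,7,4,3,3,3,3,3,3,5,8,6,3,2,1,1,1]  # two commute peaks
r = pearson(siloxane, benzene)   # strong positive correlation
```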
This study is part of an emerging body of research that finds emissions from consumer and industrial products are important sources of urban air pollution. A recent study in Science, led by CIRES and NOAA’s Brian McDonald, found that consumer and industrial products, including personal care products, household cleaners, paints, and pesticides, produced around half of the VOC emissions measured in Los Angeles during the study period.
“This study provides further evidence that as transportation emissions of VOCs have declined, other sources of VOCs, including from personal care products, are emerging as important contributors to urban air pollution,” McDonald said.
The new study also demonstrates that siloxane is a good indicator of the presence of emissions from personal care products. “Siloxane is a marker,” said Coggon. “Now we have a very good tracer for understanding the emissions patterns of other VOCs emitted from personal care products.” The research team is looking at other chemicals in personal care products that correlate with siloxane — one likely candidate is fragrance compounds. Coggon predicts they may also spike in the morning, as people commute.
“In this changing landscape, emissions from personal care products are becoming important,” Coggon said. “We all have a personal plume, from our cars and our personal care products. It’s likely that emissions from personal care products also affect the air quality in other cities besides Boulder and L.A. Our team wants to learn more about these understudied sources of pollution.”