Step aside, carrots, onions and broccoli. The newest heart-healthy vegetable could be a gigantic, record-setting radish. Scientists report that compounds found in the Sakurajima Daikon, or “monster,” radish could help protect coronary blood vessels and potentially prevent heart disease and stroke. The finding could lead to the discovery of similar substances in other vegetables and perhaps to new drug treatments.
Grown for centuries in Japan, the Sakurajima Daikon is one of the Earth’s most massive vegetables. In 2003, the Guinness Book of World Records certified a Sakurajima weighing nearly 69 pounds as the world’s heaviest radish. Radishes are good sources of antioxidants and reportedly can reduce high blood pressure and the threat of clots, a pair of risk factors for heart attack and stroke. But to date, no studies have directly compared the heart-health benefits of the Sakurajima Daikon to other radishes. To address this knowledge gap, Katsuko Kajiya and colleagues sought to find out what effects this radish would have on nitric oxide production, a key regulator of coronary blood vessel function, and to determine its underlying mechanisms.
The researchers exposed human and pig vascular endothelial cells to extracts from Sakurajima Daikon and smaller radishes. Using fluorescence microscopy and other analytical techniques, the research team found the Sakurajima Daikon radish induced more nitric oxide production in these vascular cells than a smaller Japanese radish. They also identified trigonelline, a plant hormone, as the active component in Sakurajima Daikon that appears to promote a cascade of changes in coronary blood vessels resulting in improved nitric oxide production.
A new study finds that the enzymes that make the Bermuda fireworm glow are unique among bioluminescent animals and entirely unlike those seen in fireflies. The study also examines genes associated with some of the dramatic — and reversible — changes that happen to the fireworms during reproduction.
The beautiful bioluminescence of the Bermuda fireworm, which lives throughout the Caribbean, was first documented in 1492 by Christopher Columbus and his crew just before landing in the Americas. The observations described the lights as “looking like the flame of a small candle alternately raised and lowered.”
The phenomenon went unexplained until the 1930s, when scientists matched the historic description with the unusual and precisely timed mating behavior of fireworms. During summer and autumn, beginning at 22 minutes after sunset on the third night after the full Moon, spawning female fireworms secrete a bright bluish-green luminescence that attracts males. “It’s like they have pocket watches,” said lead author Mercer R. Brugler, a Museum research associate and assistant professor at New York City College of Technology (City Tech).
“The female worms come up from the bottom and swim quickly in tight little circles as they glow, which looks like a field of little cerulean stars across the surface of jet black water,” said Mark Siddall, a curator in the American Museum of Natural History’s Division of Invertebrate Zoology and corresponding author of the study. “Then the males, homing in on the light of the females, come streaking up from the bottom like comets — they luminesce, too. There’s a little explosion of light as both dump their gametes in the water. It is by far the most beautiful biological display I have ever witnessed.”
To further investigate this phenomenon, Siddall, together with Brugler; Michael Tessler, a postdoctoral fellow in the Museum’s Sackler Institute for Comparative Genomics; and M. Teresa Aguado, a former postdoctoral fellow in the same institute who is now at the Autonomous University of Madrid, analyzed the transcriptome — the full set of RNA molecules — of a dozen female fireworms from Ferry Reach in Bermuda.
Their findings support previous work showing that fireworms “glow” because of a special luciferase enzyme they produce. These enzymes are the principal drivers of bioluminescence across the tree of life, in organisms as diverse as copepods, fungi, and jellyfish. However, the luciferases found in Bermuda fireworms and their relatives are distinct from those found in any other organism to date.
The work also took a close look at genes related to the precise reproductive timing of the fireworms, as well as the changes that take place in the animals’ bodies just prior to swarming events. These changes include the enlargement and pigmentation of the worms’ four eyes and the modification of the nephridia — an organ similar to the kidney in vertebrates — to store and release gametes.
As marine mammals evolved to make water their primary habitat, they lost the ability to make a protein that defends humans and other land-dwelling mammals from the neurotoxic effects of a popular human-made pesticide.
The implications of this discovery, announced today in Science, led researchers to call for monitoring our waterways to learn more about the impact of pesticides and agricultural run-off on marine mammals, such as dolphins, manatees, seals and whales. The research also may shed further light on the function of the gene encoding this protein in humans.
“We need to determine if marine mammals are, indeed, at an elevated risk of serious neurological damage from these pesticides because they biologically lack the ability to break them down, or if they’ve somehow adapted to avoid such damage in an as-yet undiscovered way,” said Clark, an associate professor in Pitt’s Department of Computational and Systems Biology and the Pittsburgh Center for Evolutionary Biology and Medicine. “Either way, this is the kind of serendipitous finding that results from curiosity-driven scientific research. It is helping us to understand what our genes are doing and the impact the environment can have on them.”
Meyer, a postdoctoral associate in his laboratory, knew from previous research by other scientists that some genes behind smelling and tasting lost their function during the evolution of marine mammals. They set out to see what other genes conserved in land-dwelling mammals had lost function in marine mammals.
By analyzing DNA sequences from five species of marine mammals and 53 species of terrestrial mammals, the team found that PON1 was the gene that best matched the pattern of losing function in marine mammals while retaining function in all terrestrial mammals. PON1 even beat out several genes responsible for smell and taste, senses that marine mammals don’t rely on much.
In humans and other terrestrial mammals, PON1 reduces cellular damage caused by unstable oxygen atoms. It also protects us from organophosphates, some of which are pesticides that kill insects — which lack PON1 — by disrupting their neurological systems.
Clark and Meyer worked with Joseph Gaspard, Ph.D., director of science and conservation at the Pittsburgh Zoo & PPG Aquarium, now a scientist emeritus at the U.S. Geological Survey’s Wetland and Aquatic Research Center, to obtain marine mammal blood samples from U.S. and international scientists and conservation biologists. Collaborators at the University of Washington reacted blood samples from several marine mammals with an organophosphate byproduct and observed what happened. The blood did not break down the organophosphate byproduct the way it does in land mammals, indicating that, unless a different biological mechanism is protecting the marine mammals, they would be susceptible to “organophosphate poisoning,” a form of poisoning that results from the buildup of chemical signals in the body, especially the brain.
In an attempt to learn why marine mammals lost PON1 function, the researchers traced back when the function was lost in three different groups of marine mammals. Whales and dolphins lost it soon after they split from their common ancestor with hippopotamuses 53 million years ago; manatees lost it after their split from their common ancestor with elephants 64 million years ago. But some seals likely lost PON1 function more recently, at most 21 million years ago and possibly in very recent times.
“The big question is, why did they lose function at PON1 in the first place?” said Meyer. “It’s hard to tell whether it was no longer necessary or whether it was preventing them from adapting to a marine environment. We know that ancient marine environments didn’t have organophosphate pesticides, so we think the loss might instead be related to PON1’s role in responding to the extreme oxidative stress generated by long periods of diving and rapid resurfacing. If we can figure out why these species don’t have functional PON1, we might learn more about the function of PON1 in human health, while also uncovering potential clues to help protect marine mammals most at risk.”
As an example of the potential real-world consequences of losing function at PON1, the researchers explain in their scientific manuscript that in Florida, “agricultural use of organophosphate pesticides is common and runoff can drain into manatee habitats. In Brevard County, where 70 percent of Atlantic Coast manatees are estimated to migrate or seasonally reside, agricultural lands frequently abut manatee protection zones and waterways.”
The scientists believe the next step is to launch a study that directly observes marine mammals during and shortly after periods of excess agricultural organophosphate run-off. Such a project would require increased monitoring of marine mammal habitats, as well as testing of tissues from deceased marine mammals for evidence of organophosphate exposure. The most recent estimate the research team could find of organophosphate levels in manatee habitats in Florida is a decade old, Clark said.
“Marine mammals, such as manatees or bottlenose dolphins, are sentinel species — the canary in the coal mine,” said Clark. “If you follow their health, it will tell you a lot about potential environmental issues that could eventually affect humans.”
The first full characterization measurement of an accelerator beam in six dimensions will advance the understanding and performance of current and planned accelerators around the world.
A team of researchers led by the University of Tennessee, Knoxville, conducted the measurement in a beam test facility at the Department of Energy’s Oak Ridge National Laboratory using a replica of the Spallation Neutron Source’s linear accelerator.
“Our goal is to better understand the physics of the beam so that we can improve how accelerators operate,” said Cousineau, group leader in ORNL’s Research Accelerator Division and UT joint faculty professor. “Part of that is related to being able to fully characterize or measure a beam in 6D space — and that’s something that, until now, has never been done.”
Six-dimensional space is like 3D space, but with three additional coordinates that track a particle’s motion, or velocity, along the x, y, and z axes.
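As a rough illustration (not the team’s actual analysis pipeline), the difference between a conventional 2D diagnostic and a full 6D measurement can be sketched with synthetic data; all values here are made up:

```python
import numpy as np

# Illustrative only: a bunch of particles, each described by a
# 6D phase-space vector -- three position coordinates and three
# velocity-like coordinates. Values are synthetic, not beam data.
rng = np.random.default_rng(0)
bunch = rng.normal(size=(100_000, 6))  # 100k particles in 6D

# A conventional 2D diagnostic sees only a projection, e.g. the
# first two coordinates, integrating away structure in the rest.
hist_2d, _, _ = np.histogram2d(bunch[:, 0], bunch[:, 1], bins=50)

# A full 6D measurement bins all six coordinates jointly.
hist_6d, _ = np.histogramdd(bunch, bins=10)  # already 10**6 bins

print(hist_2d.shape)  # (50, 50)
print(hist_6d.shape)  # (10, 10, 10, 10, 10, 10)
```

The joint 6D histogram is what reveals correlations between dimensions that no stack of 2D projections can reconstruct.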
“Right away we saw the beam has this complex structure in 6D space that you can’t see below 5D — layers and layers of complexities that can’t be detangled,” Cousineau said. “The measurement also revealed the beam structure is directly related to the beam’s intensity, which gets more complex as the intensity increases.”
Previous attempts to fully characterize an accelerator beam fell victim to “the curse of dimensionality,” in which measurements in low dimensions become exponentially more difficult in higher dimensions. Scientists have tried to circumvent the issue by adding three 2D measurements together to create a quasi-6D representation. The UT-ORNL team notes that approach is incomplete as a measurement of the beam’s initial conditions entering the accelerator, which determine beam behavior farther down the linac.
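The scaling behind that curse can be shown with one line of arithmetic; the sample count per coordinate here is illustrative, not the team’s actual grid:

```python
# The "curse of dimensionality" in miniature: if each coordinate must
# be sampled at N points, a full d-dimensional scan needs N**d
# measurements. N = 30 is an illustrative choice.
N = 30  # sample points per coordinate (assumed for illustration)

for d in (2, 6):
    print(f"{d}D scan: {N**d:,} points")
```

A 2D scan at this resolution needs 900 points; the same resolution in 6D needs 729,000,000, which is why full 6D measurements demand days of dedicated beam time.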
As part of efforts to boost the power output of SNS, ORNL physicists used the beam test facility to commission the new radio frequency quadrupole, the first accelerating element located at the linac’s front-end assembly. With the infrastructure already in place, a research grant from the National Science Foundation to the University of Tennessee enabled outfitting the beam test facility with the state-of-the-art 6D measurement capability. Conducting 6D measurements in an accelerator has been limited by the need for multiple days of beam time, which can be a challenge for production accelerators.
“Because we have a replica of the linac’s front-end assembly at the beam test facility, we don’t have to worry about interrupting users’ experiment cycles at SNS. That provides us with unfettered access to perform these time-consuming measurements, which is something we wouldn’t have at other facilities,” said lead author Brandon Cathey, a UT graduate student.
“This result shows the value of combining the freedom and ingenuity of NSF-funded academic research with facilities available through the broad national laboratory complex,” said Vyacheslav Lukin, the NSF program officer who oversees the grant to the University of Tennessee. “There is no better way to introduce a new scientist — a graduate student — to the modern scientific enterprise than by allowing them to lead a first-of-a-kind research project at a facility that uniquely can dissect the particles that underpin what we know and understand about matter and energy.”
The researchers’ ultimate goal is to model the entire beam, including mitigating so-called beam halo, or beam loss — when particles travel to the outer extremes of the beam and are lost. The more immediate challenge, they say, will be finding software tools capable of analyzing the roughly 5 million data points the 6D measurement generated during the 35-hour period.
“When we proposed making a 6D measurement 15 years ago, the problems associated with the curse of dimensionality seemed insurmountable,” said ORNL physicist and coauthor Alexander Aleksandrov. “Now that we’ve succeeded, we’re sure we can improve the system to make faster, higher resolution measurements, adding an almost ubiquitous technique to the arsenal of accelerator physicists everywhere.”
New research shows that for the vast majority of individuals, sodium consumption does not increase health risks except for those who eat more than five grams a day, the equivalent of 2.5 teaspoons of salt.
Fewer than five per cent of individuals in developed countries exceed that level.
The large, international study also shows that even for those individuals there is good news. Any health risk of sodium intake is virtually eliminated if people improve their diet quality by adding fruits, vegetables, dairy foods, potatoes, and other potassium-rich foods.
The study followed 94,000 people, aged 35 to 70, for an average of eight years in communities from 18 countries around the world and found an associated risk of cardiovascular disease and stroke only where the average intake was greater than five grams of sodium a day.
China is the only country in their study where 80 per cent of communities have a sodium intake of more than five grams a day. In the other countries, the majority of the communities had an average sodium consumption of 3 to 5 grams a day (equivalent to 1.5 to 2.5 teaspoons of salt).
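For readers who want to check the figures, here is a sketch of the sodium-to-salt arithmetic behind them. The sodium fraction of table salt follows from atomic masses; the grams-per-teaspoon figure is an assumption inferred from the article’s own numbers:

```python
# Sketch of the salt/sodium conversion used in the article.
# NaCl is ~39.3% sodium by mass (22.99 / 58.44 g/mol); the article's
# figures imply roughly 5 grams of salt per teaspoon (an assumption).
NA_FRACTION = 22.99 / 58.44   # mass fraction of sodium in NaCl
SALT_PER_TSP_G = 5.0          # assumed grams of salt per teaspoon

def sodium_to_teaspoons(sodium_g):
    """Grams of sodium -> approximate teaspoons of table salt."""
    salt_g = sodium_g / NA_FRACTION
    return salt_g / SALT_PER_TSP_G

print(round(sodium_to_teaspoons(5.0), 1))  # ~2.5 tsp, the study's threshold
print(round(sodium_to_teaspoons(2.0), 1))  # ~1.0 tsp, for two grams of sodium
```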
“The World Health Organization recommends consumption of less than two grams of sodium — that’s one teaspoon of salt — a day as a preventative measure against cardiovascular disease, but there is little evidence, in terms of improved health outcomes, that individuals ever achieve such a low level,” said Andrew Mente, first author of the study and a PHRI researcher.
He added that the American Heart Association recommends even less — 1.5 grams of sodium a day for individuals at risk of heart disease.
“Only in the communities with the most sodium intake, those over five grams a day, which is mainly in China, did we find a direct link between sodium intake and major cardiovascular events like heart attack and stroke.
“In communities that consumed less than five grams of sodium a day, the opposite was the case. Sodium consumption was inversely associated with myocardial infarction, or heart attacks, and total mortality, with no increase in stroke.”
Mente added: “We found all major cardiovascular problems, including death, decreased in communities and countries where there is an increased consumption of potassium, which is found in foods such as fruits, vegetables, dairy foods, potatoes, and nuts and beans.”
The information for the research article came from the ongoing, international Prospective Urban Rural Epidemiology (PURE) study run by the PHRI. Mente is also an associate professor of the Department of Health Research Methods, Evidence and Impact at McMaster University.
Most previous studies relating sodium intake to heart disease and stroke were based on individual-level information, said Martin O’Donnell.
“Public health strategies should be based on best evidence. Our findings demonstrate that community-level interventions to reduce sodium intake should target communities with high sodium consumption, and should be embedded within approaches to improve overall dietary quality.
“There is no convincing evidence that people with moderate or average sodium intake need to reduce their sodium intake for prevention of heart disease and stroke.”
Injecting particles into the atmosphere to cool the planet and counter the warming effects of climate change would do nothing to offset the crop damage from rising global temperatures, according to a new analysis by University of California, Berkeley, researchers.
By analyzing the past effects of Earth-cooling volcanic eruptions, and the response of crops to changes in sunlight, the team concluded that any improvements in yield from cooler temperatures would be negated by lower productivity due to reduced sunlight. The findings have important implications for our understanding of solar geoengineering, one proposed method for helping humanity manage the impacts of global warming.
“Shading the planet keeps things cooler, which helps crops grow better. But plants also need sunlight to grow, so blocking sunlight can affect growth. For agriculture, the unintended impacts of solar geoengineering are equal in magnitude to the benefits,” said lead author Jonathan Proctor, a UC Berkeley doctoral candidate in the Department of Agricultural and Resource Economics. “It’s a bit like performing an experimental surgery; the side-effects of treatment appear to be as bad as the illness.”
“Unknown unknowns make everybody nervous when it comes to global policies, as they should,” said Solomon Hsiang, co-lead author of the study and Chancellor’s Associate Professor of Public Policy at UC Berkeley. “The problem in figuring out the consequences of solar geoengineering is that we can’t do a planetary-scale experiment without actually deploying the technology. The breakthrough here was realizing that we could learn something by studying the effects of giant volcanic eruptions that geoengineering tries to copy.”
Hsiang is director of UC Berkeley’s Global Policy Laboratory, where Proctor is a doctoral fellow.
Proctor and Hsiang will publish their findings online in the journal Nature on August 8.
Some people have pointed to past episodes of global cooling caused by gases emitted during massive volcanic eruptions, such as Mt. Pinatubo in the Philippines in 1991, and argued that humans could purposely inject sulfate aerosols into the upper atmosphere to artificially cool Earth and alleviate the greenhouse warming caused by increased levels of carbon dioxide. Aerosols — in this case, minute droplets of sulfuric acid — reflect a small percentage of sunlight back into space, reducing the temperature a few degrees.
“It’s like putting an umbrella over your head when you’re hot,” Proctor said. “If you put a global sunshade up, it would slow warming.”
Pinatubo, for example, injected about 20 million tons of sulfur dioxide into the atmosphere, reducing sunlight by about 2.5 percent and lowering the average global temperature by about half a degree Celsius (nearly 1 degree Fahrenheit).
The team linked maize, soy, rice and wheat production from 105 countries from 1979 to 2009 to global satellite observations of these aerosols to study their effect on agriculture. Pairing these results with global climate models, the team calculated that the loss of sunlight from a sulfate-based geoengineering program would cancel its intended benefits of protecting crops from damaging extreme heat.
“It’s similar to using one credit card to pay off another credit card: at the end of the day, you end up where you started without having solved the problem,” Hsiang said.
Some earlier studies suggested that aerosols might also improve crop yields by scattering sunlight and allowing more of the sun’s energy to reach interior leaves typically shaded by upper canopy leaves. This benefit of scattering appears to be weaker than previously thought.
“We are the first to use actual experimental and observational evidence to get at the total impacts that sulfate-based geoengineering might have on yields,” Proctor said. “Before I started the study, I thought the net impact of changes in sunlight would be positive, so I was quite surprised by the finding that scattering light decreases yields.”
Despite the study’s conclusions, Proctor said, “I don’t think we should necessarily write off solar geoengineering. For agriculture, it might not work that well, but there are other sectors of the economy that could potentially benefit substantially.”
Proctor and Hsiang noted that their methods could be used to investigate the impact of geoengineering on other segments of the economy, human health and the functioning of natural ecosystems.
They did not address other types of geoengineering, such as capture and storage of carbon dioxide, or issues surrounding geoengineering, such as its impact on Earth’s protective ozone layer and who gets to set Earth’s thermostat.
“Society needs to be objective about geoengineering technologies and develop a clear understanding of the potential benefits, costs and risks,” Proctor said. “At present, uncertainty about these factors dwarfs what we understand.”
The authors emphasize the need for more research into the human and ecological consequences of geoengineering, both good and bad.
“The most certain way to reduce damages to crops and, in turn, people’s livelihood and well-being, is reducing carbon emissions,” Proctor said.
“Perhaps what is most important is that we have respect for the potential scale, power and risks of geoengineering technologies,” Hsiang said. “Sunlight powers everything on the planet, so we must understand the possible outcomes if we are going to try to manage it.”
A new study led by scientists at the University of Bristol has warned that unless we mitigate current levels of carbon dioxide emissions, Western Europe and New Zealand could revert to the hot tropical climate of the early Paleogene period — 56-48 million years ago.
As seen from the ongoing heat wave, the knock-on effects of such extreme warmth include arid land and fires as well as impacts on health and infrastructure.
The early Paleogene is a period of great interest to climate change scientists as carbon dioxide levels (around 1,000 ppmv) are similar to those predicted for the end of this century.
Dr David Naafs from the University of Bristol’s School of Earth Sciences said: “We know that the early Paleogene was characterised by a greenhouse climate with elevated carbon dioxide levels.
“Most of the existing estimates of temperatures from this period are from the ocean, not the land — what this study attempts to answer is exactly how warm it got on land during this period.”
Scientists used molecular fossils of microorganisms in ancient peat (lignite) to estimate land temperatures 50 million years ago. This demonstrated that annual land temperatures in Western Europe, as well as New Zealand, were higher than previously thought, between 23 and 29 °C, some 10 to 15 °C higher than current average temperatures in these areas.
These results suggest that temperatures similar to those of the current heat wave that is influencing western Europe and other regions would become the new norm by the end of this century if CO2 levels in the atmosphere continue to increase.
Professor Rich Pancost, Co-author and Director of the University of Bristol Cabot Institute, added: “Our work adds to the evidence for a very hot climate under potential end-of-century carbon dioxide levels. “Importantly, we also study how the Earth system responded to that warmth. For example, this and other hot time periods were associated with evidence for arid conditions and extreme rainfall events.”
The research team will now turn their attentions to geographical areas in lower-latitudes to see how hot land temperatures were there.
Dr Naafs said: “Did the tropics, for example, become ecological dead zones because temperatures in excess of 40 °C were too high for most forms of life to survive?
“Some climate models suggest this, but we currently lack critical data.
“Our results hint at the possibility that the tropics, like the mid-latitudes, were hotter than present, but more work is needed to quantify temperatures from these regions.”
Obscured by thick clouds of absorbing dust, the closest supermassive black hole to the Earth lies 26,000 light years away at the centre of the Milky Way. This gravity monster, which has a mass four million times that of the Sun, is surrounded by a small group of stars orbiting at high speed. This extreme environment — the strongest gravitational field in our galaxy — makes it the perfect place to test gravitational physics, particularly Einstein’s general theory of relativity.
New infrared observations from the exquisitely sensitive GRAVITY, NACO and SINFONI instruments on ESO’s Very Large Telescope (VLT) have now allowed astronomers to follow one of these stars, called S2, as it passed very close to the black hole during May 2018 at a speed in excess of 25 million kilometres per hour — three percent of the speed of light — and at a distance of less than 20 billion kilometres.
These extremely delicate measurements were made by an international team led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, in conjunction with collaborators around the world. The observations form the culmination of a 26-year series of ever more precise observations of the centre of the Milky Way using ESO instruments. ‘This is the second time that we have observed the close passage of S2 around the black hole in our galactic centre. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution’, explains Genzel. ‘We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects.’
The new measurements clearly reveal an effect called gravitational redshift. Light from the star is stretched to longer wavelengths by the very strong gravitational field of the black hole. And the stretch in wavelength of light from S2 agrees precisely with that predicted by Einstein’s theory of general relativity. This is the first time that this deviation from the predictions of simpler Newtonian gravity has been observed in the motion of a star around a supermassive black hole. The team used SINFONI to measure the motion of S2 towards and away from Earth and the GRAVITY interferometric instrument to make extraordinarily precise measurements of the position of S2 in order to define the shape of its orbit. GRAVITY creates such sharp images that it can reveal the motion of the star from night to night as it passes close to the black hole — 26,000 light years from Earth.
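As a back-of-envelope check (not the collaboration’s actual analysis), the size of the expected shift can be estimated from the figures quoted above:

```python
# Order-of-magnitude estimate of the redshift of S2 at pericentre,
# using the article's own figures. Constants are standard values.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg

M = 4e6 * M_sun        # black hole mass: four million Suns
r = 20e9 * 1e3         # pericentre distance: ~20 billion km, in metres
v = 25e6 * 1e3 / 3600  # star's speed: ~25 million km/h, in m/s

z_grav = G * M / (r * c**2)    # gravitational redshift
z_doppler = v**2 / (2 * c**2)  # transverse (relativistic) Doppler shift

print(f"gravitational z ~ {z_grav:.1e}")
print(f"transverse Doppler z ~ {z_doppler:.1e}")
```

Both effects come out at a few parts in ten thousand, equivalent to shifts of order 100 km/s in the measured line velocities, which is comfortably within reach of SINFONI’s spectroscopy.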
‘Our first observations of S2, about two years ago, already showed that we would have the ideal black hole laboratory’, adds Frank Eisenhauer (MPE), Co-Principal Investigator of the GRAVITY instrument. ‘During the close passage, we managed not only to precisely follow the star on its orbit, we could even detect the faint glow around the black hole on most of the images.’ By combining the position and velocity measurements from SINFONI and GRAVITY, as well as previous observations using other instruments, the team could compare them to the predictions of Newtonian gravity, general relativity and other theories of gravity. As expected, the new results are inconsistent with Newtonian predictions and in excellent agreement with the predictions of general relativity. More than one hundred years after he published his paper setting out the equations of general relativity, Einstein has been proved right once more.
The hardware contribution of the Institute of Physics I of the University of Cologne was the development and construction of the two spectrometers of GRAVITY. The spectrometers analyse the wavelength of the observed stellar light and convert the received photons into electronic signals. ‘GRAVITY is a technological challenge. However, after more than two decades of astrophysical research on the high velocity stars in the Galactic Centre and on the development of astronomical instrumentation, the effort has been rewarded with an excellent result in experimental physics’, says Andreas Eckhart from the University of Cologne.
Continuing observations are expected to reveal another relativistic effect later in the year: a small rotation of the star’s orbit, known as Schwarzschild precession, as S2 moves away from the black hole.
There may be more habitable planets in the universe than we previously thought, according to researchers who suggest that plate tectonics — long assumed to be a requirement for suitable conditions for life — are in fact not necessary.
When searching for habitable planets or life on other planets, scientists look for biosignatures of atmospheric carbon dioxide. On Earth, atmospheric carbon dioxide increases surface heat through the greenhouse effect. Carbon also cycles to the subsurface and back to the atmosphere through natural processes.
“Volcanism releases gases into the atmosphere, and then through weathering, carbon dioxide is pulled from the atmosphere and sequestered into surface rocks and sediment,” said Bradford Foley, assistant professor of geosciences. “Balancing those two processes keeps carbon dioxide at a certain level in the atmosphere, which is really important for whether the climate stays temperate and suitable for life.”
Most of Earth’s volcanoes are found at the border of tectonic plates, which is one reason scientists believed they were necessary for life. Subduction, in which one plate is pushed deeper into the subsurface by a colliding plate, can also aid in carbon cycling by pushing carbon into the mantle.
Planets without tectonic plates are known as stagnant lid planets. On these planets, the crust is one giant, spherical plate floating on the mantle, rather than separate pieces. These are thought to be more widespread than planets with plate tectonics. In fact, Earth is the only planet with confirmed tectonic plates.
Foley and Andrew Smye, assistant professor of geosciences, created a computer model of the lifecycle of a planet. They looked at how much heat its climate could retain based on its initial heat budget, or the amount of heat and heat-producing elements present when a planet forms. On Earth, elements such as uranium, thorium and potassium release heat as they undergo radioactive decay.
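A toy sketch of such a radiogenic heat budget, assuming standard isotope half-lives and illustrative (made-up) initial heating rates, not the researchers’ actual model:

```python
# Toy model: each heat-producing isotope's output decays exponentially
# with its half-life. Half-lives are standard values in billions of
# years; the initial rates below are arbitrary, for illustration only.
HALF_LIFE_GYR = {"U-238": 4.47, "Th-232": 14.0, "K-40": 1.25}
INITIAL_HEAT = {"U-238": 1.0, "Th-232": 1.0, "K-40": 1.0}  # arbitrary units

def radiogenic_heat(t_gyr):
    """Total heat production t_gyr after formation (arbitrary units)."""
    return sum(
        INITIAL_HEAT[iso] * 0.5 ** (t_gyr / HALF_LIFE_GYR[iso])
        for iso in HALF_LIFE_GYR
    )

for t in (0, 2, 4.5):
    print(f"t = {t} Gyr: heat = {radiogenic_heat(t):.2f}")
```

Heat production falls steadily over time, which is why a planet’s volcanism, and with it the CO2 degassing that keeps its climate temperate, eventually wanes.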
After running hundreds of simulations to vary a planet’s size and chemical composition, the researchers found that stagnant lid planets can sustain conditions for liquid water for billions of years. At the highest extreme, they could sustain life for up to 4 billion years, roughly Earth’s life span to date.
“You still have volcanism on stagnant lid planets, but it’s much shorter lived than on planets with plate tectonics because there isn’t as much cycling,” said Smye. “Volcanoes result in a succession of lava flows, which are buried like layers of a cake over time. Rocks and sediment heat up more the deeper they are buried.”
The researchers found that at high enough heat and pressure, carbon dioxide gas can escape from rocks and make its way to the surface, a process known as degassing. On Earth, Smye said, the same process occurs with water in subduction fault zones.
This degassing increases with the types and quantities of heat-producing elements present in a planet, but only up to a point, said Foley.
“There’s a sweet spot range where a planet is releasing enough carbon dioxide to keep the planet from freezing over, but not so much that the weathering can’t pull carbon dioxide out of the atmosphere and keep the climate temperate,” he said.
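The balance Foley describes can be caricatured as a one-line box model: atmospheric carbon dioxide rises with degassing and falls with weathering, settling at the level where the two cancel. A toy sketch, in which the constants and the linear weathering law are invented purely for illustration:

```python
# Toy carbon-balance box model: dC/dt = degassing - k * C, where C is
# atmospheric CO2 and weathering drawdown grows with C. All numbers and
# the linear weathering law are illustrative, not from the study.

def simulate_co2(C0, degassing, k, dt=0.1, steps=1000):
    """Step the box model forward; it settles toward degassing / k."""
    C = C0
    for _ in range(steps):
        C += dt * (degassing - k * C)
    return C

# Whatever the starting level, CO2 converges to the balance point.
print(round(simulate_co2(0.0, 2.0, 0.5), 6))  # settles near 2.0 / 0.5 = 4.0
```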
According to the researchers’ model, the presence and amount of heat-producing elements were far better indicators of a planet’s potential to sustain life than the presence of plate tectonics.
“One interesting take-home point of this study is that the initial composition or size of a planet is important in setting the trajectory for habitability,” the researchers said. “The future fate of a planet is set from the outset of its birth.”
A new way of arranging advanced computer components called memristors on a chip could enable them to be used for general computing, which could cut energy consumption by a factor of 100.
This would improve performance in low power environments such as smartphones or make for more efficient supercomputers.
“Historically, the semiconductor industry has improved performance by making devices faster. But although the processors and memories are very fast, they can’t be efficient because they have to wait for data to come in and out,” said Wei Lu, U-M professor of electrical and computer engineering and co-founder of memristor startup Crossbar Inc.
Memristors might be the answer. Named as a portmanteau of memory and resistor, they can be programmed to have different resistance states — meaning they store information as resistance levels. These circuit elements enable memory and processing in the same device, cutting out the data transfer bottleneck experienced by conventional computers in which the memory is separate from the processor.
However, unlike ordinary bits, which are either 1 or 0, memristors can take on resistances along a continuum. Some applications, such as brain-mimicking (neuromorphic) computing, take advantage of this analog nature. But for ordinary computing, the small variations in current passing through a memristor device cannot be read precisely enough for numerical calculations.
Lu and his colleagues got around this problem by digitizing the current outputs — defining current ranges as specific bit values (i.e., 0 or 1). The team was also able to map large mathematical problems into smaller blocks within the array, improving the efficiency and flexibility of the system.
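The digitization step can be pictured as simple thresholding of the measured current. A minimal sketch, where the threshold and the microamp values are invented for illustration rather than taken from the actual hardware:

```python
# Sketch of digitizing an analog memristor readout: currents falling in a
# given range are read as a discrete bit value. The threshold and units
# here are illustrative, not real device parameters.

def read_bit(current_uA, threshold_uA=5.0):
    """Map a measured current to a binary value: below threshold -> 0, else 1."""
    return 1 if current_uA >= threshold_uA else 0

print([read_bit(c) for c in [0.8, 4.9, 5.1, 9.7]])  # [0, 0, 1, 1]
```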
Computers with these new blocks, which the researchers call “memory-processing units,” could be particularly useful for implementing machine learning and artificial intelligence algorithms. They are also well suited to tasks that are based on matrix operations, such as simulations used for weather prediction. The simplest mathematical matrices, akin to tables with rows and columns of numbers, can map directly onto the grid of memristors.
Once the memristors are set to represent the numbers, operations that multiply and sum the rows and columns can be taken care of simultaneously, with a set of voltage pulses along the rows. The current measured at the end of each column contains the answers. A typical processor, in contrast, would have to read the value from each cell of the matrix, perform multiplication, and then sum up each column in series.
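The one-step multiply-and-sum described above is just Ohm’s law plus Kirchhoff’s current law acting on the crossbar wires. A software sketch of what the physics computes (the conductance and voltage values are illustrative):

```python
# Sketch of analog matrix-vector multiplication in a memristor crossbar.
# Each memristor at (row i, column j) is programmed to conductance G[i][j];
# applying voltage V[i] along row i makes that device pass current
# G[i][j] * V[i] (Ohm's law), and the column wire sums those currents
# (Kirchhoff's current law), so column j reads out sum_i G[i][j] * V[i].

def crossbar_multiply(G, V):
    rows, cols = len(G), len(G[0])
    return [sum(G[i][j] * V[i] for i in range(rows)) for j in range(cols)]

# Illustrative 3x2 array: conductances encode the matrix, voltages the vector.
G = [[1.0, 2.0],
     [0.5, 1.5],
     [2.0, 0.0]]
V = [1.0, 2.0, 3.0]
print(crossbar_multiply(G, V))  # column currents: [8.0, 5.0]
```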
“We get the multiplication and addition in one step. It’s taken care of through physical laws. We don’t need to manually multiply and sum in a processor,” Lu said.
His team chose to solve partial differential equations as a test for a 32×32 memristor array, which Lu imagines as just one block of a future system. These equations, including those behind weather forecasting, underpin many problems in science and engineering but are very difficult to solve. The difficulty comes from the complicated forms and multiple variables needed to model physical phenomena.
When solving partial differential equations exactly is impossible, solving them approximately can require supercomputers. These problems often involve very large matrices of data, so the memory-processor communication bottleneck is neatly solved with a memristor array.
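Discretizing a partial differential equation is what turns it into the large matrix problem a memristor array is suited to. A textbook sketch (not the equations from the study): finite differences plus Jacobi relaxation, whose inner loop is exactly the repeated matrix-vector product the hardware accelerates.

```python
# Sketch: solving u''(x) = -1 on [0, 1] with u(0) = u(1) = 0 by finite
# differences and Jacobi iteration. Each sweep is effectively one
# matrix-vector product with the (tridiagonal) discretization matrix,
# the operation a memristor crossbar performs in a single analog step.

n = 9                      # interior grid points
h = 1.0 / (n + 1)          # grid spacing
u = [0.0] * n              # initial guess

for _ in range(2000):      # Jacobi sweeps
    u = [((u[i - 1] if i > 0 else 0.0)
          + (u[i + 1] if i < n - 1 else 0.0)
          + h * h) / 2.0 for i in range(n)]

# The exact solution is u(x) = x(1 - x)/2; at x = 0.5 that is 0.125.
print(round(u[n // 2], 6))  # 0.125
```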