
Galaxy outskirts likely hunting grounds for dying massive stars and black holes


Findings from a Rochester Institute of Technology study provide further evidence that the outskirts of spiral galaxies host massive black holes. These overlooked regions are new places to observe gravitational waves created when the massive bodies collide.

The study winds back time on massive black holes by analyzing their visible precursors — supernovae with collapsing cores. The slow decay of these massive stars creates bright signatures in the electromagnetic spectrum before stellar evolution ends in black holes.

Using data from the Lick Observatory Supernova Search, a survey of nearby galaxies, the team compared the supernova rate in the outskirts of spiral galaxies with that of known hosts — dwarf/satellite galaxies — and found comparable numbers for typical spiral outskirts and typical dwarf galaxies, roughly two core-collapse supernovae per millennium.

Low levels of elements heavier than hydrogen and helium found in dwarf/satellite galaxies create favorable conditions for massive black holes to form and create binary pairs. A similar galactic environment in the outer disks of spiral galaxies also creates likely hunting grounds for massive black holes, said Sukanya Chakrabarti, lead author and assistant professor in the RIT School of Physics and Astronomy.

“If these core-collapse supernovae are the predecessors to the binary black holes detected by LIGO (Laser Interferometer Gravitational-wave Observatory), then what we’ve found is a reliable method of identifying the host galaxies of LIGO sources,” said Chakrabarti. “Because these black holes have an electromagnetic counterpart at an earlier stage in their life, we can pinpoint their location in the sky and watch for massive black holes.”

The study’s findings complement Chakrabarti’s 2017 study, which showed that the outer parts of spiral galaxies could contribute to LIGO detection rates. The regions form stars at a comparable rate to dwarf galaxies and are low in heavy element content, creating a conducive home for massive black holes. The current study isolates potential candidates within these favorable galactic environments.

“We see now that these are both important contributors,” Chakrabarti said. “The next step is to do deeper surveys to see if we can improve the rate.”

“This work may help us determine which galaxies to monitor for electromagnetic counterparts of massive black holes.”

Thin gap on stellar family portrait


A thin gap has been discovered on the Hertzsprung-Russell Diagram (HRD), the most fundamental of all maps in stellar astronomy, a finding that provides new information about the interior structures of low mass stars in the Milky Way Galaxy.

Just as a graph can be made of people with different heights and weights, astronomers compare stars using their luminosities and temperatures. The HRD is a “family portrait” of the stars in the Galaxy, where stars such as the Sun, Altair, Alpha Centauri, Betelgeuse, the north star Polaris and Sirius can be compared. The newly discovered gap cuts diagonally across the HRD and indicates where a crucial internal change occurs in the structures of stars. The gap outlines where stars transition from being larger and mostly convective with a thin radiative layer to being smaller and fully convective.

Radiation and convection are two ways to transfer energy from inside a star to its surface. Radiation transfers energy through space, and convection is the transfer of energy from one place to another by the movement of fluid.

The researchers estimate that stars above the gap contain more than about one-third the mass of the Sun, and those below have less mass. Because different types of stars have different masses, this feature reveals where different types of interior structures are on the HRD. The gap occurs in the middle of the region of “red dwarf” stars, which are much smaller and cooler than the Sun, but compose three of every four stars in the solar neighborhood.

“We were pretty excited to see this result, and it provides us new insights into the structures and evolution of stars,” said the study’s first author, a staff astronomer in the Department of Physics and Astronomy at Georgia State.

In 2013, the European Space Agency (ESA) launched the Gaia spacecraft to make a census of the stars in the Milky Way Galaxy and to create a three-dimensional map. In April 2018, the ESA released results of this mission, revealing an unprecedented map of more than one billion stars in the Galaxy, a 10,000-fold increase in the number of stars with accurate distances. The research team led by Georgia State plotted nearly 250,000 of the closest stars in the Gaia data on the HRD to reveal the gap. Georgia State’s researchers have studied the distances to nearby stars for years, which enabled them to interpret the results and notice this thin gap.
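To make the mapping step concrete, here is a minimal sketch of how a Hertzsprung-Russell-style diagram can be built from parallax-based distances. It is an illustration only: the values are synthetic and the names (parallax_mas, g_mag, bp_rp) are placeholders, not the actual Gaia archive schema or the team's pipeline.

```python
# Minimal sketch: building an HR-style diagram (color vs. absolute magnitude)
# from parallax-based distances. Synthetic data stand in for a real catalogue.
import numpy as np
import matplotlib.pyplot as plt

def absolute_magnitude(apparent_mag, parallax_mas):
    """M = m - 5*log10(d_pc) + 5, with d_pc = 1000 / parallax_mas."""
    distance_pc = 1000.0 / parallax_mas
    return apparent_mag - 5.0 * np.log10(distance_pc) + 5.0

rng = np.random.default_rng(0)
parallax_mas = rng.uniform(20.0, 100.0, 5000)   # nearby stars have large parallaxes
bp_rp = rng.uniform(-0.5, 4.0, 5000)            # color index, a proxy for temperature
g_mag = rng.uniform(5.0, 18.0, 5000)            # apparent brightness

abs_g = absolute_magnitude(g_mag, parallax_mas)

# An HRD plots a temperature proxy (color) against luminosity (absolute magnitude),
# with brighter stars toward the top; gaps show up as underpopulated bands.
plt.hexbin(bp_rp, abs_g, gridsize=80, bins="log")
plt.gca().invert_yaxis()
plt.xlabel("BP - RP color (bluer to redder)")
plt.ylabel("Absolute G magnitude")
plt.show()
```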

Results from a theoretical computer model that simulates the activity inside these stars suggest the gap is caused by a slight shrinking in size that occurs when a star becomes convective all the way through.

X-ray technology reveals never-before-seen matter around black hole


In an international collaboration between Japan and Sweden, scientists clarified how gravity affects the shape of matter near the black hole in the binary system Cygnus X-1. The findings may help scientists further understand the physics of strong gravity and the evolution of black holes and galaxies.

Near the center of the constellation of Cygnus is a star orbiting the first black hole ever discovered. Together, they form a binary system known as Cygnus X-1. This black hole is also one of the brightest sources of X-rays in the sky. However, the geometry of the matter that gives rise to this light was uncertain. The research team revealed this information using a new technique called X-ray polarimetry.

Taking a picture of a black hole is not easy. For one thing, it is not yet possible to observe a black hole directly, because light cannot escape it. Instead of observing the black hole itself, scientists observe light coming from matter close to the black hole. In the case of Cygnus X-1, this matter comes from the star that closely orbits the black hole.

Most light that we see, like sunlight, vibrates in many directions. Polarization filters light so that it vibrates in one direction. This is how snow goggles with polarized lenses let skiers see more easily where they are going down the mountain — they work because the filter cuts light reflecting off the snow.
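To make the goggles analogy concrete, here is a minimal sketch of Malus's law, which gives the fraction of polarized light an ideal filter passes as a function of the angle between the light's polarization and the filter's axis. The angles are illustrative values, not measurements from the study.

```python
# Minimal sketch of Malus's law: transmitted intensity I = I0 * cos^2(theta),
# where theta is the angle between the incoming polarization and the filter axis.
import math

def transmitted_fraction(theta_deg):
    """Fraction of polarized light passed by an ideal linear polarizer."""
    return math.cos(math.radians(theta_deg)) ** 2

# Glare reflecting off snow is largely horizontally polarized; goggles oriented to
# pass vertical polarization meet that glare at nearly 90 degrees and suppress it.
for theta in (0, 30, 60, 90):
    print(f"theta = {theta:2d} deg -> transmitted fraction = {transmitted_fraction(theta):.2f}")
# theta =  0 deg -> transmitted fraction = 1.00
# theta = 30 deg -> transmitted fraction = 0.75
# theta = 60 deg -> transmitted fraction = 0.25
# theta = 90 deg -> transmitted fraction = 0.00
```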

“It’s the same situation with hard X-rays around a black hole,” Hiroshima University Assistant Professor and study coauthor Hiromitsu Takahashi said. “However, hard X-rays and gamma rays coming from near the black hole penetrate this filter. There are no such ‘goggles’ for these rays, so we need another special kind of treatment to direct and measure this scattering of light.”

The team needed to figure out where the light was coming from and where it scattered. To make both of these measurements, they launched a balloon-borne X-ray polarimeter called PoGO+. From the data, the team could piece together what fraction of the hard X-rays reflected off the accretion disk and so identify the shape of the matter.

Two competing models describe how matter near a black hole can look in a binary system such as Cygnus X-1: the lamp-post model and the extended model. In the lamp-post model, the corona is compact and bound closely to the black hole. Photons bend toward the accretion disk, resulting in more reflected light. In the extended model, the corona is larger and spread around the vicinity of the black hole. In this case, the light reflected by the disk is weaker.

Since light did not bend that much under the strong gravity of the black hole, the team concluded that the black hole fit the extended corona model.

With this information, the researchers can uncover more characteristics about black holes. One example is its spin. The effects of spin can modify the space-time surrounding the black hole. Spin could also provide clues into the evolution of the black hole. It could be slowing down in speed since the beginning of the universe, or it could be accumulating matter and spinning faster.

“The black hole in Cygnus is one of many,” Takahashi said. “We would like to study more black holes using X-ray polarimetry, like those closer to the center of galaxies. Maybe then we can better understand black hole evolution, as well as galaxy evolution.”

Optical neural network demo


Researchers at the National Institute of Standards and Technology (NIST) have made a silicon chip that distributes optical signals precisely across a miniature brain-like grid, showcasing a potential new design for neural networks.

The human brain has billions of neurons (nerve cells), each with thousands of connections to other neurons. Many computing research projects aim to emulate the brain by creating circuits of artificial neural networks. But conventional electronics, including the electrical wiring of semiconductor circuits, often impedes the extremely complex routing required for useful neural networks.

The NIST team proposes to use light instead of electricity as a signaling medium. Neural networks already have demonstrated remarkable power in solving complex problems, including rapid pattern recognition and data analysis. The use of light would eliminate interference due to electrical charge and the signals would travel faster and farther.

“Light’s advantages could improve the performance of neural nets for scientific data analysis such as searches for Earth-like planets and quantum information science, and accelerate the development of highly intuitive control systems for autonomous vehicles,” the researchers say.

A conventional computer processes information through algorithms, or human-coded rules. A neural network, by contrast, relies on a network of connections among processing elements, or neurons, which can be trained to recognize certain patterns of stimuli. A neural or neuromorphic computer would consist of a large, complex system of neural networks.

Described in a new paper, the NIST chip overcomes a major challenge to the use of light signals by vertically stacking two layers of photonic waveguides — structures that confine light into narrow lines for routing optical signals, much as wires route electrical signals. This three-dimensional (3D) design enables complex routing schemes, which are necessary to mimic neural systems. Furthermore, this design can easily be extended to incorporate additional waveguiding layers when needed for more complex networks.

The stacked waveguides form a three-dimensional grid with 10 inputs or “upstream” neurons each connecting to 10 outputs or “downstream” neurons, for a total of 100 receivers. Fabricated on a silicon wafer, the waveguides are made of silicon nitride and are each 800 nanometers (nm) wide and 400 nm thick. Researchers created software to automatically generate signal routing, with adjustable levels of connectivity between the neurons.

Laser light was directed into the chip through an optical fiber. The goal was to route each input to every output group, following a selected distribution pattern for light intensity or power. Power levels represent the pattern and degree of connectivity in the circuit. The authors demonstrated two schemes for controlling output intensity: uniform (each output receives the same power) and a “bell curve” distribution (in which middle neurons receive the most power, while peripheral neurons receive less).
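As a rough illustration of those two target schemes, the sketch below computes normalized power fractions for a hypothetical row of 10 outputs. The Gaussian width used for the bell curve is an arbitrary choice for the illustration, not a parameter reported by the NIST team.

```python
# Minimal sketch: two target output-power distributions for a 10-output grid.
import numpy as np

n_outputs = 10
outputs = np.arange(n_outputs)

# Uniform scheme: every downstream neuron receives the same fraction of the power.
uniform = np.full(n_outputs, 1.0 / n_outputs)

# "Bell curve" scheme: middle neurons receive the most power, peripheral ones less.
center, sigma = (n_outputs - 1) / 2.0, 2.0      # sigma is an illustrative width
bell = np.exp(-0.5 * ((outputs - center) / sigma) ** 2)
bell /= bell.sum()                               # conserve total routed power

for i in outputs:
    print(f"output {i}:  uniform = {uniform[i]:.3f}   bell = {bell[i]:.3f}")
```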

To evaluate the results, researchers made images of the output signals. All signals were focused through a microscope lens onto a semiconductor sensor and processed into image frames. This method allows many devices to be analyzed at the same time with high precision. The output was highly uniform, with low error rates, confirming precise power distribution.

“We’ve really done two things here,” Chiles said. “We’ve begun to use the third dimension to enable more optical connectivity, and we’ve developed a new measurement technique to rapidly characterize many devices in a photonic system. Both advances are crucial as we begin to scale up to massive optoelectronic neural systems.”

The big picture: Mouse memory cells are about experience, not place


When it comes to memory, it’s more than just “location, location, location.” New research suggests that the brain doesn’t store all memories in ‘place cells’, the main type of neuron in the hippocampus, a structure crucial for navigation and memory. Instead, the findings point to a separate population of hippocampal cells that store a memory of the experience, or context, as a whole rather than of any particular location within it.

The hippocampus is well-known as the domain of place cells, whose discovery and function as mental maps of space were recognized with the 2014 Nobel Prize in Physiology or Medicine. On the other hand, as a hotspot for memory research, the hippocampus is proposed as the physical location for memories of experiences, stored in engram cells. “Neuroscience is still grappling with the engram memory concept,” says research group leader Thomas McHugh of the RIKEN Center for Brain Science in Japan. “We know what these cells do when they’re activated, but what do they represent and how do they function?”

The assumption has been that memory engrams are just place cells, but McHugh’s group thinks it has an alternative explanation. In their experiments, mice spent time in one kind of cage to form a memory of that environment. The researchers used optogenetic methods to identify the cells that were active during that time and therefore contributed to the memory. These cells represented only a fraction of hippocampal place cells and had larger place fields — the corresponding real-world area that gets the cell excited when the mouse is exploring. Analysis of the activity across a large number of cells revealed that, while most place cells kept the same spatial map during both the initial and later visits to the cage, the engram cells had uncorrelated activity between the two time points. The only exception was very early in both visits, when the cells’ activity was similar, which is what you would expect if they are involved in recall of the context.
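One way to picture that comparison is to correlate each cell's spatial firing map between the two visits: a stable place cell gives a high correlation, while a cell whose activity bears no fixed relation to location gives a correlation near zero. The sketch below is schematic, using made-up rate maps rather than the authors' actual analysis pipeline.

```python
# Schematic sketch: correlating a cell's spatial firing ("rate") map across two visits.
# Rate maps here are synthetic; real analyses bin spikes by the animal's position.
import numpy as np

rng = np.random.default_rng(1)

def map_correlation(map_a, map_b):
    """Pearson correlation between two flattened spatial rate maps."""
    return np.corrcoef(map_a.ravel(), map_b.ravel())[0, 1]

# A stable place cell: the same firing field on both visits, plus noise.
field = np.zeros((20, 20))
field[5:9, 5:9] = 1.0
place_visit1 = field + 0.1 * rng.random((20, 20))
place_visit2 = field + 0.1 * rng.random((20, 20))

# An engram-like cell: activity with no fixed relation to location across visits.
engram_visit1 = rng.random((20, 20))
engram_visit2 = rng.random((20, 20))

print("place cell map correlation :", round(map_correlation(place_visit1, place_visit2), 2))
print("engram cell map correlation:", round(map_correlation(engram_visit1, engram_visit2), 2))
# Expected: the place cell's maps correlate strongly; the engram-like cell's do not.
```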

When the mice were placed in a second, different cage, the engram cells remained inactive — they were already ‘occupied’ with the previous memory. In fact, the researchers were able to tell the first and second environments apart just by comparing the activity of these cells. The engram cells are active only for the memory of the context itself, not for specific locations, while place cells are active during exploration, creating and updating a spatial map. Recognizing a context or environment doesn’t require walking through or exploring it, though, so location cells appear to be distinct from memory cells.

The instability of the spatial information signaled by engram cells, compared with the majority of place cells, indicates that they deal with the ‘big picture’: the macro scale of a context, not a specific location within it. The researchers propose that engram cells may not store memories per se but instead act as an index that ties memory-relevant details together, wherever else in the brain those may be. “Their role is to track elements of a memory, whether those are from sound or vision or other senses, and then trigger their recall by activating other parts of the brain like the cortex,” McHugh hypothesizes. While the hippocampus clearly does underlie spatial memory, this newly revealed function as an index for contextual identity shows that this brain region is about more than just maps. “We long assumed memory is anchored to stable representations of locations,” says McHugh, “but it’s actually the opposite.”

Mars Express detects liquid water hidden under planet’s south pole


Radar data collected by ESA’s Mars Express point to a pond of liquid water buried under layers of ice and dust in the south polar region of Mars.

Evidence for the Red Planet’s watery past is prevalent across its surface in the form of vast dried-out river valley networks and gigantic outflow channels clearly imaged by orbiting spacecraft. Orbiters, together with landers and rovers exploring the martian surface, also discovered minerals that can only form in the presence of liquid water.

But the climate has changed significantly over the course of the planet’s 4.6-billion-year history, and liquid water cannot exist on the surface today, so scientists are looking underground. Early results from the 15-year-old Mars Express spacecraft already found that water-ice exists at the planet’s poles and is also buried in layers interspersed with dust.

The presence of liquid water at the base of the polar ice caps has long been suspected; after all, from studies on Earth, it is well known that the melting point of water decreases under the pressure of an overlying glacier. Moreover, the presence of salts on Mars could further reduce the melting point of water and keep the water liquid even at below-freezing temperatures.

But until now, evidence from the Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) instrument, the first radar sounder ever to orbit another planet, remained inconclusive.

It has taken the persistence of scientists working with this subsurface-probing instrument to develop new techniques in order to collect as much high-resolution data as possible to confirm their exciting conclusion.

Ground-penetrating radar works by sending radar pulses towards the surface and timing how long they take to be reflected back to the spacecraft, and with what strength. The properties of the material the pulses pass through influence the returned signal, which can be used to map the subsurface topography.
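As a rough illustration of that timing principle, the sketch below converts a two-way radar travel time into a reflector depth, assuming a single uniform layer and a representative radar wave speed in water ice. The numbers are illustrative, not MARSIS measurements.

```python
# Minimal sketch: converting a two-way radar travel time into a reflector depth.
C_VACUUM = 3.0e8        # speed of light in vacuum, m/s
EPS_ICE = 3.1           # approximate relative permittivity of water ice

def depth_from_two_way_time(two_way_time_s, rel_permittivity):
    """The pulse travels down and back, so use half the delay at the in-ice wave speed."""
    wave_speed = C_VACUUM / rel_permittivity ** 0.5
    return wave_speed * two_way_time_s / 2.0

# Example: an echo arriving about 17.6 microseconds after the surface reflection
# corresponds to a reflector roughly 1.5 km down in ice-like material.
print(f"{depth_from_two_way_time(17.6e-6, EPS_ICE) / 1000.0:.2f} km")
```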

The radar investigation shows that the south polar region of Mars is made of many layers of ice and dust down to a depth of about 1.5 km in the 200 km-wide area analysed in this study. A particularly bright radar reflection underneath the layered deposits is identified within a 20 km-wide zone.

Analysing the properties of the reflected radar signals and considering the composition of the layered deposits and expected temperature profile below the surface, the scientists interpret the bright feature as an interface between the ice and a stable body of liquid water, which could be laden with salty, saturated sediments. For MARSIS to be able to detect such a patch of water, it would need to be at least several tens of centimetres thick.

“This subsurface anomaly on Mars has radar properties matching water or water-rich sediments.”

“This is just one small study area; it is an exciting prospect to think there could be more of these underground pockets of water elsewhere, yet to be discovered.”

“We’d seen hints of interesting subsurface features for years, but we couldn’t reproduce the result from orbit to orbit, because the sampling rates and resolution of our data were previously too low.”

“We had to come up with a new operating mode to bypass some onboard processing and trigger a higher sampling rate and thus improve the resolution of the footprint of our dataset: now we see things that simply were not possible before.”

The finding is somewhat reminiscent of Lake Vostok, discovered some 4 km below the ice in Antarctica on Earth. Some forms of microbial life are known to thrive in Earth’s subglacial environments, but could underground pockets of salty, sediment-rich liquid water on Mars also provide a suitable habitat, either now or in the past? Whether life has ever existed on Mars remains an open question, and is one that Mars missions, including the current European-Russian ExoMars orbiter and future rover, will continue to explore.

“The long duration of Mars Express, and the exhausting effort made by the radar team to overcome many analytical challenges, enabled this much-awaited result, demonstrating that the mission and its payload still have a great science potential,” says Dmitri Titov, ESA’s Mars Express project scientist.

“This thrilling discovery is a highlight for planetary science and will contribute to our understanding of the evolution of Mars, the history of water on our neighbour planet and its habitability.”


Yellowstone super-volcano has a different history than previously thought


The long-dormant Yellowstone super-volcano in the American West has a different history than previously thought, according to a new study by a Virginia Tech geoscientist.

Scientists have long thought that Yellowstone Caldera, part of the Rocky Mountains and located mostly in Wyoming, is powered by heat from the Earth’s core, similar to other hotspot volcanoes such as the recently active Kilauea volcano in Hawaii.

“In this research, there was no evidence of heat coming directly up from the Earth’s core to power the surface volcano at Yellowstone,” Zhou said. “Instead, the underground images we captured suggest that Yellowstone volcanoes were produced by a gigantic ancient oceanic plate that dove under the Western United States about 30 million years ago. This ancient oceanic plate broke into pieces, resulting in perturbations of unusual rocks in the mantle which led to volcanic eruptions in the past 16 million years.”

The eruptions were very explosive, Zhou added. A theoretical seismologist, Zhou created X-ray-like images of the Earth’s deep interior from USArray — part of the Earthscope project funded by the National Science Foundation — and discovered an anomalous underground structure at a depth of about 250 to 400 miles right beneath the line of volcanoes.

In her study, Zhou found the new images of the Earth’s deep interior showed that the oceanic Farallon plate, which used to be where the Pacific Ocean is now, wedged itself beneath the present-day Western United States. The ancient oceanic plate was broken into pieces just like the seafloor in the Pacific today. A section of the subducted oceanic plate started tearing off and sinking down to the deep earth.

The sinking section of oceanic plate slowly pushed hot materials upward to form the volcanoes that now make up Yellowstone. Further, the series of volcanoes that make up Yellowstone have been slowly moving, achingly so, ever since. “The process started at the Oregon-Idaho border about 16 million years ago and propagated northeastward, forming a line of volcanoes that are progressively younger as they stretch northeast to present-day Wyoming,” Zhou said.

The previously held plume model was used to explain the unique Yellowstone hotspot track — the line of volcanoes in Oregon, Idaho, and Wyoming that dots part of the American West. “If the North American plate were moving slowly over a position-fixed plume at Yellowstone, it would displace older volcanoes towards the Oregon-Idaho border and form a line of volcanoes, but such a deep plume has not been found,” Zhou said. So, what caused the track? Zhou intends to find out.

“It has always been a problem there, and scientists have tried to come up with different ways to explain the cause of Yellowstone volcanoes, but it has been unsuccessful,” she said, adding that hotspot tracks are more common in oceans, such as the Hawaiian islands. The frequent geyser eruptions at Yellowstone are of course not volcanic eruptions driven by magma, but by super-heated water. The last Yellowstone super-eruption was about 630,000 years ago, according to experts. Zhou has no predictions on when or if Yellowstone could erupt again.

The use of the X-ray-like images for this study is unique in itself. Just as humans can see objects in a room when a light is on, Zhou said seismometers can see structures deep within the earth when an earthquake occurs. The vibrations spread out and create waves when they hit rocks. The waves are detected by seismometers and used in what is known as diffraction tomography.

“This is the first time the new imaging theory has been applied to this type of seismic data, which allowed us to see anomalous structures in the Earth’s mantle that would otherwise not be resolvable using traditional methods,” Zhou said.

Zhou will continue her study of Yellowstone. “The next step will be to increase the resolution of the X-ray-like images of the underground rock,” she added.

“More detailed images of the unusual rocks in the deep earth will allow us to use computer simulation to recreate the fragmentation of the gigantic oceanic plate and test different scenarios of how rock melting and magma feeding system work for the Yellowstone volcanoes.”

Time is running out in the tropics: Researchers warn of global biodiversity collapse


A global biodiversity collapse is imminent unless we take urgent, concerted action to reverse species loss in the tropics.

In their paper ‘The future of hyperdiverse tropical ecosystems’ an international team has warned that a failure to act quickly and decisively will greatly increase the risk of unprecedented and irrevocable species loss in the most diverse parts of the planet.

The study is the first high-level report on the state of all four of the world’s most diverse tropical ecosystems — tropical forests, savannas, lakes and rivers, and coral reefs.

The authors found that although the tropics cover just 40% of the planet, they are home to more than three-quarters of all species including almost all shallow-water corals and more than 90% of the world’s bird species. Most of these species are found nowhere else, and millions more are as yet unknown to science.

“At the current rate of species description — about 20,000 new species per year — it can be estimated that at least 300 years will be necessary to catalogue biodiversity,” said Dr. Benoit Guénard, Assistant Professor at the University of Hong Kong and an author of the study.
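A quick back-of-envelope check of that estimate, assuming the description rate stays constant; the implied count of still-undescribed species is an illustration, not a figure reported in the study.

```python
# Back-of-envelope check: 20,000 new species described per year for 300 years
# implies on the order of six million species still awaiting formal description.
rate_per_year = 20_000
years_needed = 300
print(f"~{rate_per_year * years_needed:,} species yet to be catalogued")
```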

And across tropical ecosystems, many species face the ‘double jeopardy’ of being harmed by both local human pressures — such as overfishing or selective logging — and droughts or heatwaves linked to climate change.

Dr Alexander Lees, from Manchester Metropolitan University explained that while over-harvesting of wildlife was responsible for the annual loss of millions of highly trafficked animals such as pangolins, it also affected many other less-well known species.

He said: “Even many small songbirds are at risk of imminent global extinction due to their capture for the pet trade in South East Asia. The rainforests where they live are increasingly falling silent.”

The declining health of tropical ecosystems also threatens the well-being of millions of people across the planet.

Lead author Professor Jos Barlow from Lancaster University said: “Although they cover just 0.1% of the ocean surface, coral reefs provide fish resources and coastal protection for up to 200 million people. And between them, humid tropical forests and savannas store 40% of the carbon in the terrestrial biosphere and support rainfall in some of the world’s most important agricultural regions.”

Whilst the conclusions are bleak, the study also outlined the actions that are needed to turn the health of these vital ecosystems around.

The researchers have called for a step-change in efforts to support sustainable development and effective conservation interventions to preserve and restore the tropical habitats that have been the home and last refuge to the overwhelming majority of Earth’s biodiversity for millions of years.

Professor Barlow said: “The fate of the tropics will be largely determined by what happens elsewhere in the planet. While most of us are familiar with the impact of climate change on the polar regions, it is also having devastating consequences across the tropics — and without urgent action could undermine local conservation interventions.”

Dr Christina Hicks from Lancaster University said, as a powerful economic driver of change, the role of developed countries was also felt deeply in the tropics.

She said: “Conservation strategies must address the underlying drivers of environmental change whilst avoiding exacerbating deeply rooted inequalities. Environmental aid has remained static in recent years, and remains a drop in the ocean compared to the income generated by resource extraction.”

He said: “The past decades have seen a boom in proposals, innovations and insights about the science, governance and management of tropical ecosystems from remote sensing and big data to new legal frameworks for business. The clock is ticking for these proposals and insights to be properly tested.”


Reversing cause and effect is no trouble for quantum computers


Watch a movie backwards and you’ll likely get confused — but a quantum computer wouldn’t. That’s the conclusion of researcher Mile Gu at the Centre for Quantum Technologies (CQT) at the National University of Singapore and Nanyang Technological University and collaborators.

The international team show that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it’s as if the quantum computer doesn’t need to distinguish between cause and effect at all.

The new work is inspired by an influential discovery made almost ten years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences will have a built-in arrow of time. An observer who sees the data played from beginning to end, like the frames of a movie, can model what comes next using only a modest amount of memory about what occurred before. An observer who tries to model the system in reverse has a much harder task — potentially needing to track orders of magnitude more information.

This discovery came to be known as ‘causal asymmetry’. It seems intuitive. After all, modelling a system when time is running backwards is like trying to infer a cause from an effect. We are used to finding that more difficult than predicting an effect from a cause. In everyday life, understanding what will happen next is easier if you know what just happened, and what happened before that.

However, researchers are always intrigued to discover asymmetries that are linked to time-ordering. This is because the fundamental laws of physics are ambivalent about whether time moves forwards or in reverse.

“When the physics does not impose any direction on time, where does causal asymmetry — the memory overhead needed to reverse cause and effect — come from?” asks Gu.

The first studies of causal asymmetry used models with classical physics to generate predictions. Crutchfield and Mahoney teamed up with Gu and collaborators Jayne Thompson, Andrew Garner and Vlatko Vedral at CQT to find out whether quantum mechanics changes the situation.

They found that it did. Models that use quantum physics, the team proves, can entirely mitigate the memory overhead. A quantum model forced to emulate the process in reverse time will always outperform a classical model modelling the process in forward time.

The work has some profound implications. “The most exciting thing for us is the possible connection with the arrow of time,” says Jayne Thompson, first author on the work. “If causal asymmetry is only found in classical models, it suggests our perception of cause and effect, and thus time, can emerge from enforcing a classical explanation on events in a fundamentally quantum world.”

Next, the team wants to understand how this connects to other ideas of time. “Every community has their own arrow of time, and everybody wants to explain where they come from,” the researchers note. Crutchfield and Mahoney called causal asymmetry an example of time’s ‘barbed arrow’.

Most iconic is the ‘thermodynamic arrow’. It comes from the idea that disorder, or entropy, will always increase — a little here and there, in everything that happens, until the Universe ends as one big, hot mess. While causal asymmetry is not the same as the thermodynamic arrow, they could be interrelated. Classical models that track more information also generate more disorder. “This hints that causal asymmetry can have entropic consequence.”

The results may also have practical value. Doing away with the classical overhead for reversing cause and effect could help quantum simulation. “Like being played a movie in reverse time, sometimes we may be required to make sense of things that are presented in an order that is intrinsically difficult to model. In such cases, quantum methods could prove vastly more efficient than their classical counterparts.”

Alcohol-related cirrhosis deaths skyrocket in young adults


Deaths from cirrhosis rose in all but one state between 1999 and 2016, with increases seen most often among young adults.

The deaths linked to the end stages of liver damage jumped by 65 percent, with alcohol a major cause, adults age 25-34 hit hardest, and fatalities highest among whites, American Indians and Hispanics.

Liver specialist Elliot B. Tapper, M.D., says he’s witnessed the disturbing shift in demographics among the patients with liver failure he treats at Michigan Medicine. The new study confirms that in communities across the country more young people are drinking themselves to death.

The data shows adults age 25-34 experienced the highest average annual increase in cirrhosis deaths — about 10.5 percent each year. The rise was driven entirely by alcohol-related liver disease, the authors say.

“Each alcohol-related death means decades of lost life, broken families and lost economic productivity,” says Tapper, a member of the University of Michigan Division of Gastroenterology and Hepatology and health services researcher at the U-M Institute for Healthcare Policy and Innovation.

“In addition, medical care of those dying from cirrhosis costs billions of dollars.”

The rise in liver deaths is not where liver specialists expected to be after gains in fighting hepatitis C, a major liver threat seen often in Baby Boomers. Antiviral medications have set the course to one day eradicate hepatitis C.

Cirrhosis can be caused by a virus like hepatitis C, fatty liver disease or alcohol abuse. The increase in liver deaths highlights new challenges in preventing cirrhosis deaths beyond hepatitis.

“We thought we would see improvements, but these data make it clear: even after hepatitis C, we will still have our work cut out for us,” says Tapper.

That mortality due to cirrhosis began increasing in 2009 — around the time of the Great Recession when the economic downturn led to loss of people’s savings, homes and jobs — may offer a clue as to its cause.

“We suspect that there is a connection between increased alcohol use and unemployment associated with the global financial crisis. But more research is needed,” Tapper says.

Cirrhosis caused a total of 460,760 deaths during the study period; about one-third were attributed to hepatocellular carcinoma, a common type of liver cancer that is often caused by cirrhosis, researchers found.

In 2016 alone, 11,073 lives were lost to liver cancer, double the number of deaths in 1999.

Researchers studied the trends in liver deaths due to cirrhosis by examining death certificates compiled by the Centers for Disease Control and Prevention’s Wide-ranging Online Data for Epidemiologic Research project.

“The rapid rise in liver deaths underscores gaps in care and opportunities for prevention,” says Parikh, study co-author and liver specialist at Michigan Medicine.

The study’s goal was to determine trends in liver disease deaths and which groups have been impacted most across the country. The research showed:

  • Fewer Asians and Pacific Islanders died of liver cancer.
  • Cirrhosis is hitting some places especially hard, namely Kentucky, Alabama, Arkansas and New Mexico, where cirrhosis deaths were highest.
  • A state-by-state analysis showed cirrhosis mortality is improving only in Maryland.

Deaths due to alcohol-related liver disease are entirely preventable, say the authors, who suggest strategies such as taxes on alcohol, minimum prices for alcohol, and reduced marketing and advertising to curb problem drinking. Higher alcohol costs have been linked with decreased alcohol-related deaths.