Technology

Einstein’s general relativity confirmed near black hole


Obscured by thick clouds of absorbing dust, the closest supermassive black hole to the Earth lies 26,000 light years away at the centre of the Milky Way. This gravity monster, which has a mass four million times that of the Sun, is surrounded by a small group of stars orbiting at high speed. This extreme environment — the strongest gravitational field in our galaxy — makes it the perfect place to test gravitational physics, particularly Einstein’s general theory of relativity.

New infrared observations from the exquisitely sensitive GRAVITY, NACO and SINFONI instruments on ESO’s Very Large Telescope (VLT) have now allowed astronomers to follow one of these stars, called S2, as it passed very close to the black hole during May 2018 at a speed in excess of 25 million kilometres per hour — three percent of the speed of light — and at a distance of less than 20 billion kilometres.

These extremely delicate measurements were made by an international team led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, in conjunction with collaborators around the world. The observations form the culmination of a 26-year series of ever more precise observations of the centre of the Milky Way using ESO instruments. ‘This is the second time that we have observed the close passage of S2 around the black hole in our galactic centre. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution’, explains Genzel. ‘We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects.’

The new measurements clearly reveal an effect called gravitational redshift. Light from the star is stretched to longer wavelengths by the very strong gravitational field of the black hole. And the stretch in wavelength of light from S2 agrees precisely with that predicted by Einstein’s theory of general relativity. This is the first time that this deviation from the predictions of simpler Newtonian gravity has been observed in the motion of a star around a supermassive black hole. The team used SINFONI to measure the motion of S2 towards and away from Earth and the GRAVITY interferometric instrument to make extraordinarily precise measurements of the position of S2 in order to define the shape of its orbit. GRAVITY creates such sharp images that it can reveal the motion of the star from night to night as it passes close to the black hole — 26,000 light years from Earth.
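For a rough sense of the size of this effect, the combined shift can be approximated as the sum of the gravitational redshift and the special-relativistic transverse Doppler term, z ≈ GM/(rc²) + v²/(2c²). The sketch below plugs in the rounded figures quoted above (a four-million-solar-mass black hole, a pericentre distance of about 20 billion kilometres, a pericentre speed of a few percent of the speed of light); it is an order-of-magnitude illustration, not the team's actual orbital fit.

```python
# Back-of-the-envelope estimate of the redshift of S2 at closest approach.
# Illustrative only: the mass, distance and speed are rounded values from
# the article, not the team's fitted orbital parameters.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

M = 4.0e6 * M_sun    # black hole mass (~4 million solar masses)
r = 2.0e13           # pericentre distance (~20 billion km), in metres
v = 7.7e6            # pericentre speed (roughly 3% of c), in m/s

z_grav = G * M / (r * c**2)    # gravitational redshift
z_tdop = v**2 / (2 * c**2)     # transverse (special-relativistic) Doppler term
z_total = z_grav + z_tdop

print(f"gravitational redshift  ~ {z_grav:.1e}")
print(f"transverse Doppler term ~ {z_tdop:.1e}")
print(f"combined shift z ~ {z_total:.1e} (~{z_total * c / 1e3:.0f} km/s apparent velocity offset)")
```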

‘Our first observations of S2, about two years ago, already showed that we would have the ideal black hole laboratory’, adds Frank Eisenhauer (MPE), Co-Principal Investigator of the GRAVITY instrument. ‘During the close passage, we managed not only to precisely follow the star on its orbit, we could even detect the faint glow around the black hole on most of the images.’ By combining the position and velocity measurements from SINFONI and GRAVITY, as well as previous observations using other instruments, the team could compare them to the predictions of Newtonian gravity, general relativity and other theories of gravity. As expected, the new results are inconsistent with Newtonian predictions and in excellent agreement with the predictions of general relativity. More than one hundred years after he published his paper setting out the equations of general relativity, Einstein has been proved right once more.

The hardware contribution of the Institute of Physics I of the University of Cologne was the development and construction of the two spectrometers of GRAVITY. The spectrometers analyse the wavelength of the observed stellar light and convert the received photons into electronic signals. ‘GRAVITY is a technological challenge. However, after more than two decades of astrophysical research on the high velocity stars in the Galactic Centre and on the development of astronomical instrumentation, the effort has been rewarded with an excellent result in experimental physics’, says Andreas Eckart from the University of Cologne.

Continuing observations are expected to reveal another relativistic effect later in the year: a small rotation of the star’s orbit, known as Schwarzschild precession, as S2 moves away from the black hole.

Plate tectonics not needed to sustain life


There may be more habitable planets in the universe than we previously thought, according to researchers who suggest that plate tectonics, long assumed to be a requirement for suitable conditions for life, are in fact not necessary.

When searching for habitable planets or life on other planets, scientists look for biosignatures of atmospheric carbon dioxide. On Earth, atmospheric carbon dioxide increases surface heat through the greenhouse effect. Carbon also cycles to the subsurface and back to the atmosphere through natural processes.

“Volcanism releases gases into the atmosphere, and then through weathering, carbon dioxide is pulled from the atmosphere and sequestered into surface rocks and sediment,” said Bradford Foley, assistant professor of geosciences. “Balancing those two processes keeps carbon dioxide at a certain level in the atmosphere, which is really important for whether the climate stays temperate and suitable for life.”

Most of Earth’s volcanoes are found at the border of tectonic plates, which is one reason scientists believed they were necessary for life. Subduction, in which one plate is pushed deeper into the subsurface by a colliding plate, can also aid in carbon cycling by pushing carbon into the mantle.

Planets without tectonic plates are known as stagnant lid planets. On these planets, the crust is one giant, spherical plate floating on the mantle, rather than separate pieces. Such planets are thought to be more widespread than planets with plate tectonics. In fact, Earth is the only planet with confirmed tectonic plates.

Foley and Andrew Smye, assistant professor of geosciences, created a computer model of the lifecycle of a planet. They looked at how much heat its climate could retain based on its initial heat budget, or the amount of heat and heat-producing elements present when a planet forms. Some elements produce heat when they decay. On Earth, the main heat-producing elements are uranium, thorium and potassium, which release heat as they decay.
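Because the heat budget argument rests on radioactive decay, a small sketch can make the timescales concrete. The half-lives below are standard values for the main heat-producing isotopes; the initial heating fractions are illustrative placeholders, not the compositions used in Foley and Smye's model.

```python
# Minimal sketch: how radiogenic heating fades as the main heat-producing
# isotopes decay. Half-lives are standard values; the initial heating
# fractions are illustrative placeholders, not the paper's compositions.
half_lives_gyr = {        # half-lives in billions of years
    "U-238": 4.47,
    "U-235": 0.70,
    "Th-232": 14.0,
    "K-40": 1.25,
}

# Assumed share of each isotope in the planet's initial radiogenic heat output.
initial_fraction = {"U-238": 0.35, "U-235": 0.05, "Th-232": 0.35, "K-40": 0.25}

def relative_heat(t_gyr):
    """Radiogenic heat output at time t, relative to the starting value."""
    return sum(
        initial_fraction[iso] * 0.5 ** (t_gyr / half_lives_gyr[iso])
        for iso in half_lives_gyr
    )

for t in (0, 1, 2, 4.5):
    print(f"t = {t:>4} Gyr: heat output ~ {relative_heat(t):.2f} of the initial value")
```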

After running hundreds of simulations to vary a planet’s size and chemical composition, the researchers found that stagnant lid planets can sustain conditions for liquid water for billions of years. At the highest extreme, they could sustain life for up to 4 billion years, roughly Earth’s life span to date.

“You still have volcanism on stagnant lid planets, but it’s much shorter lived than on planets with plate tectonics because there isn’t as much cycling,” said Smye. “Volcanoes result in a succession of lava flows, which are buried like layers of a cake over time. Rocks and sediment heat up more the deeper they are buried.”

The researchers found that at high enough heat and pressure, carbon dioxide gas can escape from rocks and make its way to the surface, a process known as degassing. On Earth, Smye said, the same process occurs with water in subduction fault zones.

How much degassing occurs depends on the types and quantities of heat-producing elements present in a planet, up to a certain point, said Foley.

“There’s a sweet spot range where a planet is releasing enough carbon dioxide to keep the planet from freezing over, but not so much that the weathering can’t pull carbon dioxide out of the atmosphere and keep the climate temperate,” he said.
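The balance Foley describes can be pictured with a toy box model: volcanic degassing adds carbon dioxide at a rate that fades as the planet's interior cools, while weathering removes it at a rate that grows with the amount present. The sketch below is purely illustrative; the rate constants are arbitrary and are not taken from the researchers' simulations.

```python
# Toy box model of atmospheric CO2 on a cooling stagnant lid planet:
# dC/dt = degassing(t) - weathering(C). All constants are arbitrary
# placeholders, not values from Foley and Smye's simulations.

def degassing(t_gyr, q0=1.0, tau_gyr=2.0):
    """Volcanic CO2 source, fading as interior heating declines (arbitrary units)."""
    return q0 * 0.5 ** (t_gyr / tau_gyr)

def weathering(c, k=0.8):
    """CO2 sink from weathering, stronger when more CO2 is in the atmosphere."""
    return k * c

def evolve(c0=1.0, t_end_gyr=6.0, dt_gyr=0.001):
    """Integrate the box model with a simple forward Euler step."""
    c, t, snapshots = c0, 0.0, []
    while t < t_end_gyr:
        c = max(c + (degassing(t) - weathering(c)) * dt_gyr, 0.0)
        t += dt_gyr
        if round(t / dt_gyr) % 1000 == 0:      # report every 1 Gyr
            snapshots.append((t, c))
    return snapshots

for t, c in evolve():
    print(f"t = {t:3.1f} Gyr  atmospheric CO2 ~ {c:.2f} (arbitrary units)")
```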

According to the researchers’ model, the presence and amount of heat-producing elements were far better indicators of a planet’s potential to sustain life than whether it had plate tectonics.

“One interesting take-home point of this study is that the initial composition or size of a planet is important in setting the trajectory for habitability,” the researchers concluded. “The future fate of a planet is set from the outset of its birth.”

Memory-processing unit could bring memristors to the masses


A new way of arranging advanced computer components called memristors on a chip could enable them to be used for general computing, which could cut energy consumption by a factor of 100.

This would improve performance in low power environments such as smartphones or make for more efficient supercomputers.

“Historically, the semiconductor industry has improved performance by making devices faster. But although the processors and memories are very fast, they can’t be efficient because they have to wait for data to come in and out,” said Wei Lu, U-M professor of electrical and computer engineering and co-founder of memristor startup Crossbar Inc.

Memristors might be the answer. Named as a portmanteau of memory and resistor, they can be programmed to have different resistance states — meaning they store information as resistance levels. These circuit elements enable memory and processing in the same device, cutting out the data transfer bottleneck experienced by conventional computers in which the memory is separate from the processor.

However, unlike ordinary bits, which are 1 or 0, memristors can have resistances that are on a continuum. Some applications, such as computing that mimics the brain (neuromorphic), take advantage of the analog nature of memristors. But for ordinary computing, trying to differentiate among small variations in the current passing through a memristor device is not precise enough for numerical calculations.

Lu and his colleagues got around this problem by digitizing the current outputs — defining current ranges as specific bit values (i.e., 0 or 1). The team was also able to map large mathematical problems into smaller blocks within the array, improving the efficiency and flexibility of the system.

Computers with these new blocks, which the researchers call “memory-processing units,” could be particularly useful for implementing machine learning and artificial intelligence algorithms. They are also well suited to tasks that are based on matrix operations, such as simulations used for weather prediction. The simplest mathematical matrices, akin to tables with rows and columns of numbers, can map directly onto the grid of memristors.

Once the memristors are set to represent the numbers, operations that multiply and sum the rows and columns can be taken care of simultaneously, with a set of voltage pulses along the rows. The current measured at the end of each column contains the answers. A typical processor, in contrast, would have to read the value from each cell of the matrix, perform multiplication, and then sum up each column in series.

“We get the multiplication and addition in one step. It’s taken care of through physical laws. We don’t need to manually multiply and sum in a processor,” Lu said.
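One way to see why the multiplication and addition happen in a single step: if each memristor's conductance encodes a matrix entry and each row voltage encodes a vector entry, then Ohm's law and the summation of currents along each column perform the multiply-accumulate physically. The sketch below simply emulates that arithmetic in software, with a crude quantization step standing in for the digitized current ranges described above; it is not the team's hardware interface or software.

```python
# Software emulation of a memristor crossbar computing a matrix-vector product:
# conductances G encode the matrix, row voltages v encode the input vector, and
# each column current is the sum of voltage * conductance contributions
# (Ohm's law plus current summation). The quantization step is a stand-in for
# the digitized current ranges described in the article, not the team's scheme.
import numpy as np

rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(32, 32))   # conductances of a 32x32 array (siemens)
v = rng.uniform(0.0, 0.5, size=32)           # read voltages applied to the rows (volts)

# Each column current is sum_i v[i] * G[i, j], i.e. the matrix-vector product.
column_currents = v @ G

# Digitize: map each analog current onto a small number of discrete levels.
levels = 16
lo, hi = column_currents.min(), column_currents.max()
digitized = np.round((column_currents - lo) / (hi - lo) * (levels - 1)).astype(int)

print("analog column currents (A):", column_currents[:4])
print("digitized output levels   :", digitized[:4])
```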

His team chose to solve partial differential equations as a test for a 32×32 memristor array — which Lu imagines as just one block of a future system. These equations, including those behind weather forecasting, underpin many problems in science and engineering but are very challenging to solve. The difficulty comes from the complicated forms and multiple variables needed to model physical phenomena.

When solving partial differential equations exactly is impossible, solving them approximately can require supercomputers. These problems often involve very large matrices of data, so the memory-processor communication bottleneck is neatly addressed by a memristor array.
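To see why such problems reduce to large matrix operations, consider the simplest textbook case: a one-dimensional Poisson equation u''(x) = f(x), discretized by finite differences, becomes a linear system A·u = b whose matrix-vector arithmetic is exactly the kind of workload a memristor array handles. The example below is generic and makes no claim about the specific equations solved in the study.

```python
# Finite-difference discretization of u''(x) = f(x) on [0, 1] with u(0)=u(1)=0.
# The PDE becomes a tridiagonal matrix equation A u = b, the kind of large
# linear-algebra workload the article says maps naturally onto memristor arrays.
# This is a generic textbook example, not the equations solved in the study.
import numpy as np

n = 32                          # interior grid points (one block-sized problem)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)

# Second-difference matrix: A[i,i] = -2, A[i,i+/-1] = 1, scaled by 1/h^2.
A = (np.diag(-2.0 * np.ones(n)) +
     np.diag(np.ones(n - 1), 1) +
     np.diag(np.ones(n - 1), -1)) / h**2

f = np.sin(np.pi * x)           # example source term
u = np.linalg.solve(A, f)       # solution of the discrete system

# For u'' = sin(pi x), the analytic solution is -sin(pi x) / pi^2.
print("max error vs analytic solution:", np.abs(u + np.sin(np.pi * x) / np.pi**2).max())
```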

Tickborne diseases are likely to increase, say NIAID officials


The incidence of tickborne infections in the United States has risen significantly within the past decade. It is imperative, therefore, that public health officials and scientists build a robust understanding of pathogenesis, design improved diagnostics, and develop preventive vaccines.

Bacteria cause most tickborne diseases in the United States, with Lyme disease representing the majority (82 percent) of reported cases. The spirochete Borrelia burgdorferi is the primary cause of Lyme disease in North America; it is carried by hard-bodied ticks that then feed on smaller mammals, such as white-footed mice, and larger animals, such as white-tailed deer. Although there are likely many factors contributing to increased Lyme disease incidence in the U.S., greater tick densities and their expanding geographical range have played a key role, the authors write. For example, the Ixodes scapularis tick, which is the primary source of Lyme disease in the northeastern U.S., had been detected in nearly 50 percent more counties by 2015 than was previously reported in 1996. Although most cases of Lyme disease are successfully treated with antibiotics, 10 to 20 percent of patients report lingering symptoms after effective antimicrobial therapy. Scientists need to better understand this lingering morbidity, note the authors.

Tickborne virus infections are also increasing and could cause serious illness and death. For example, Powassan virus (POWV), recognized in 1958, causes a febrile illness that can be followed by progressive and severe neurologic conditions, resulting in death in 10 to 15 percent of cases and long-term symptoms in as many as 70 percent of survivors. Only 20 U.S. cases of POWV infection were reported before 2006; 99 cases were reported between 2006 and 2016.

The public health burden of tickborne disease is considerably underreported, according to the authors. For example, the U.S. Centers for Disease Control and Prevention (CDC) reports approximately 30,000 cases of Lyme disease annually in the U.S. but estimates that the true incidence is 10 times that number. According to the authors, this is due in part to the limitations of current tickborne disease surveillance, as well as current diagnostics, which may be imprecise in some cases and are unable to recognize new tickborne pathogens as they emerge. These limitations have led researchers to explore new, innovative diagnostics with different platforms that may provide clinical benefit in the future.

It is also critical that scientists develop vaccines to prevent disease, the authors write. A vaccine to protect against Lyme disease was previously developed, but was pulled from the market and is no longer available. Future protective measures could include vaccines specifically designed to create an immune response to a pathogen, or to target pathogens inside the ticks that carry them.

By focusing research on the epidemiology of tickborne diseases, improving diagnostics, finding new treatments and developing preventive vaccines, public health officials and researchers may be able to stem the growing threat these diseases pose. In the meantime, the authors suggest, healthcare providers should advise their patients to use basic prevention techniques: wear insect repellant, wear long pants when walking in the woods or working outdoors, and check for ticks.

Glaciers in East Antarctica also ‘imperiled’ by climate change


A team of scientists from the University of California, Irvine has found evidence of significant mass loss in East Antarctica’s Totten and Moscow University glaciers, which, if they fully collapsed, could add 5 meters (16.4 feet) to the global sea level.

The glaciologists estimate that between April 2002 and September 2016, the two glaciers lost about 18.5 billion tons of ice per year — equivalent to 0.7 millimeters (0.03 inches) of global sea level rise over the analyzed time period.
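The conversion behind that figure is simple: roughly 360 billion tons of melted ice raise global mean sea level by about a millimetre. The quick check below uses that commonly quoted conversion and the article's rounded numbers.

```python
# Quick consistency check of the quoted sea level contribution. Uses the
# commonly quoted conversion of roughly 362 Gt of ice per millimetre of global
# mean sea level; the inputs are rounded, so expect only rough agreement.
mass_loss_gt_per_year = 18.5                # billion tons of ice lost per year
years = (2016 + 9 / 12) - (2002 + 4 / 12)   # April 2002 to September 2016
gt_per_mm_sea_level = 362.0                 # ~362 Gt of ice ~ 1 mm of sea level

total_loss_gt = mass_loss_gt_per_year * years
sea_level_mm = total_loss_gt / gt_per_mm_sea_level

print(f"{total_loss_gt:.0f} Gt over {years:.1f} years ~ {sea_level_mm:.2f} mm of sea level rise")
```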

UCI’s researchers discovered this by applying a locally optimized technique to data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite mission, combined with mass balance approximations from regional atmospheric climate models and ice discharge measurements from NASA’s Operation IceBridge and MEaSUREs projects.

“For this research, we used an improved methodology with GRACE data to retrieve the mass loss in an area undergoing rapid change,” said a graduate student in UCI’s Department of Earth System Science. “By overlaying these data with independent measurements, we improve our confidence in the results and the conclusion that Totten and Moscow University are imperiled.”

Making up roughly two-thirds of the Antarctic continent, East Antarctica has been viewed by polar researchers as less threatened by climate change than the volatile ice sheets in West Antarctica and the Antarctic Peninsula.

“Both of these glaciers are vulnerable to the intrusion of warm ocean water and hold considerable potential for sea level rise,” said co-author Eric Rignot, Donald Bren Professor and chair of Earth system science at UCI. “This work highlights that East Antarctic glaciers are as important to our future as those in the continent’s western regions.”

According to co-author Isabella Velicogna, professor of Earth system science, it’s challenging to study the Totten and Moscow University glaciers because the signal of change is much weaker than that of their counterparts in the west.

“In this remote part of the world, the data from GRACE and other satellite missions are critical for us to understand the glacier evolution,” she said.

Extinct vegetarian cave bear diet mystery unravelled


During the Late Pleistocene period (between 125,000 and 12,000 years ago), two bear species roamed Europe: the omnivorous brown bear and the extinct, mostly vegetarian cave bear.

Until now, very little has been known about the dietary evolution of the cave bear and how it became a vegetarian, as fossils of its direct ancestor, Deninger’s bear, are extremely scarce.

However, a new study sheds light on this. A research team from Germany and Spain found that Deninger’s bear likely had a similar diet to its descendant, the classic cave bear: new analysis shows a distinct morphology in the cranium, mandible and teeth, which has been related to a dietary specialization in the consumption of larger amounts of plant matter.

To understand the evolution of the cave bear lineage, the researchers micro-CT scanned the rare fossils and digitally removed the sediments so as not to risk damaging the fossils. Using sophisticated statistical methods, called geometric morphometrics, the researchers compared the three-dimensional shape of the mandibles and skull of Deninger’s bear with that of classic cave bears and modern bears.
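In outline, geometric morphometrics works on sets of corresponding landmark coordinates: specimens are superimposed to remove differences in position, scale and orientation, and whatever differences remain in the landmark positions are treated as pure shape variation. The sketch below shows that superimposition step for two hypothetical landmark configurations; it is a generic illustration, not the authors' analysis pipeline.

```python
# Generic geometric-morphometrics step: Procrustes superimposition of two
# hypothetical 3D landmark configurations (think corresponding points on two
# mandibles), followed by a shape-difference score. Landmarks are random
# stand-ins, not data from the fossils in the study.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(1)

landmarks_a = rng.normal(size=(20, 3))   # 20 landmarks digitized on specimen A

# Specimen B: same shape but rotated, scaled and shifted, plus small
# genuine "shape" differences.
theta = 0.4
rotation = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
landmarks_b = 1.7 * landmarks_a @ rotation.T + 5.0 + rng.normal(scale=0.05, size=(20, 3))

# Procrustes removes translation, scale and rotation; the returned disparity
# is the residual sum of squared differences, i.e. the pure shape difference.
aligned_a, aligned_b, disparity = procrustes(landmarks_a, landmarks_b)
print(f"shape disparity after superimposition: {disparity:.4f}")
```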

“The analyses showed that Deninger’s bear had very similarly shaped mandibles and skull to the classic cave bear,” the researchers report. This implies that they were adapted to the same food types and were primarily vegetarian.

“There is an ongoing discussion on the extent to which the classic cave bear was a vegetarian. And, this is especially why the new information on the diet of its direct ancestor is so important, because it teaches us that a differentiation between the diet of cave bears and brown bears was already established by 500 thousand years ago and likely earlier,” said a doctoral candidate at the Universities of the Basque Country and Bordeaux and co-author of the study.

Interestingly, researchers also found there are shape differences between the Deninger’s bears from the Iberian Peninsula and those from the rest of Europe, which are unlikely to be related to diet.

They have come up with three possibilities to explain these differences: 1) the Iberian bears are chronologically younger than the rest; 2) the Pyrenees, acting as a natural barrier, resulted in some genetic differentiation between the Iberian bears and those from the rest of Europe; or 3) there were multiple lineages, with either just one leading to the classic cave bear, or each lineage leading to a different group of cave bears.


Galaxy outskirts likely hunting grounds for dying massive stars and black holes


Findings from a Rochester Institute of Technology study provide further evidence that the outskirts of spiral galaxies host massive black holes. These overlooked regions are new places to observe gravitational waves created when the massive bodies collide.

The study winds back time on massive black holes by analyzing their visible precursors — supernovae with collapsing cores. The slow decay of these massive stars creates bright signatures in the electromagnetic spectrum before stellar evolution ends in black holes.

Using data from the Lick Observatory Supernova Search, a survey of nearby galaxies, the team compared the supernovae rate in outer spiral galaxies with that of known hosts — dwarf/satellite galaxies — and found comparable numbers for typical spiral outskirts and typical dwarf galaxies, roughly two core-collapse supernovae per millennium.

Low levels of elements heavier than hydrogen and helium found in dwarf/satellite galaxies create favorable conditions for massive black holes to form and create binary pairs. A similar galactic environment in the outer disks of spiral galaxies also creates likely hunting grounds for massive black holes, said Sukanya Chakrabarti, lead author and assistant professor in the RIT School of Physics and Astronomy.

“If these core-collapse supernovae are the predecessors to the binary black holes detected by LIGO (Laser Interferometer Gravitational-wave Observatory), then what we’ve found is a reliable method of identifying the host galaxies of LIGO sources,” said Chakrabarti. “Because these black holes have an electromagnetic counterpart at an earlier stage in their life, we can pinpoint their location in the sky and watch for massive black holes.”

The study’s findings complement Chakrabarti’s 2017 study, which showed that the outer parts of spiral galaxies could contribute to LIGO detection rates. The regions form stars at a comparable rate to dwarf galaxies and are low in heavy element content, creating a conducive home for massive black holes. The current study isolates potential candidates within these favorable galactic environments.

“We see now that these are both important contributors,” the researchers said. “The next step is to do deeper surveys to see if we can improve the rate.”

“This work may help us determine which galaxies to be on the lookout for electromagnetic counterparts of massive black holes,” they added.

Thin gap on stellar family portrait


A thin gap has been discovered on the Hertzsprung-Russell Diagram (HRD), the most fundamental of all maps in stellar astronomy, a finding that provides new information about the interior structures of low mass stars in the Milky Way Galaxy.

Just as a graph can be made of people with different heights and weights, astronomers compare stars using their luminosities and temperatures. The HRD is a “family portrait” of the stars in the Galaxy, where stars such as the Sun, Altair, Alpha Centauri, Betelgeuse, the north star Polaris and Sirius can be compared. The newly discovered gap cuts diagonally across the HRD and indicates where a crucial internal change occurs in the structures of stars. The gap outlines where stars transition from being larger and mostly convective with a thin radiative layer to being smaller and fully convective.

Radiation and convection are two ways to transfer energy from inside a star to its surface. Radiation transfers energy through space, and convection is the transfer of energy from one place to another by the movement of fluid.

The researchers estimate that stars above the gap contain more than about one-third the mass of the Sun, and those below have less mass. Because different types of stars have different masses, this feature reveals where different types of interior structures are on the HRD. The gap occurs in the middle of the region of “red dwarf” stars, which are much smaller and cooler than the Sun, but compose three of every four stars in the solar neighborhood.

“We were pretty excited to see this result, and it provides us new insights to the structures and evolution of stars,” said the study’s first author, a staff astronomer in the Department of Physics and Astronomy at Georgia State.

In 2013, the European Space Agency (ESA) launched the Gaia spacecraft to make a census of the stars in the Milky Way Galaxy and to create a three-dimensional map. In April 2018, the ESA released results of this mission, revealing an unprecedented map of more than one billion stars in the Galaxy, a 10,000-fold increase in the number of stars with accurate distances. The research team led by Georgia State plotted nearly 250,000 of the closest stars in the Gaia data on the HRD to reveal the gap. Georgia State’s researchers have studied the distances to nearby stars for years, which enabled them to interpret the results and notice this thin gap.
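The basic recipe for placing a star on the HRD from Gaia-style measurements is to convert its apparent magnitude and parallax into an absolute magnitude and plot that against its colour. The snippet below sketches that conversion for a hypothetical handful of nearby stars; the column values are made up, and the actual study involved far more careful sample selection than this.

```python
# Sketch of building a Hertzsprung-Russell / colour-magnitude diagram from
# Gaia-style columns. The toy data are illustrative; real work needs quality
# cuts, extinction handling and much larger samples.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical inputs: apparent G magnitude, BP-RP colour, parallax in mas.
g_mag = np.array([10.2, 12.5, 14.1, 9.8, 13.3])
bp_rp = np.array([1.2, 2.1, 2.8, 0.9, 2.4])
parallax_mas = np.array([25.0, 40.0, 60.0, 15.0, 55.0])

# Absolute magnitude from parallax: M = m + 5*log10(parallax[mas]) - 10
abs_mag = g_mag + 5 * np.log10(parallax_mas) - 10

plt.scatter(bp_rp, abs_mag, s=10)
plt.gca().invert_yaxis()              # brighter stars plotted upward
plt.xlabel("BP - RP colour")
plt.ylabel("Absolute G magnitude")
plt.title("Toy colour-magnitude diagram")
plt.show()
```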

Using results from a theoretical computer model that simulates the activity inside the stars, the researchers conclude that the gap appears to be caused by a slight shrinking in size when a star becomes convective all the way through.

X-ray technology reveals never-before-seen matter around black hole


In an international collaboration between Japan and Sweden, scientists clarified how gravity affects the shape of matter near the black hole in the binary system Cygnus X-1. The findings may help scientists further understand the physics of strong gravity and the evolution of black holes and galaxies.

Near the center of the constellation of Cygnus, a star orbits the first black hole ever discovered. Together, they form a binary system known as Cygnus X-1. This black hole is also one of the brightest sources of X-rays in the sky. However, the geometry of the matter that gives rise to this light was uncertain. The research team revealed this information using a new technique called X-ray polarimetry.

Taking a picture of a black hole is not easy. For one thing, it is not yet possible to observe a black hole directly, because light cannot escape it. Rather than observing the black hole itself, scientists observe light coming from matter close to it. In the case of Cygnus X-1, this matter comes from the star that closely orbits the black hole.

Most of the light we see, such as sunlight, vibrates in many directions. Polarization filters light so that it vibrates in one direction. This is how snow goggles with polarized lenses let skiers see more easily where they are going down the mountain: the filter cuts light reflecting off the snow.

“It’s the same situation with hard X-rays around a black hole,” Hiroshima University Assistant Professor and study coauthor Hiromitsu Takahashi said. “However, hard X-rays and gamma rays coming from near the black hole penetrate this filter. There are no such ‘goggles’ for these rays, so we need another special kind of treatment to direct and measure this scattering of light.”

The team needed to figure out where the light was coming from and where it scattered. To make both of these measurements, they launched a balloon-borne X-ray polarimeter called PoGO+. From there, the team could piece together what fraction of the hard X-rays was reflected off the accretion disk and identify the shape of the matter.
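Polarimetry results of this kind are usually expressed through the Stokes parameters I, Q and U, from which the polarization fraction and angle follow directly. The snippet below shows that standard conversion with made-up numbers; it is not the PoGO+ analysis pipeline.

```python
# Standard conversion from Stokes parameters to polarization fraction and
# angle, the quantities an X-ray polarimeter such as PoGO+ reports.
# The I, Q, U values below are made up for illustration.
import math

I, Q, U = 1.0, 0.05, 0.03     # example Stokes parameters (arbitrary units)

polarization_fraction = math.hypot(Q, U) / I
polarization_angle = 0.5 * math.degrees(math.atan2(U, Q))

print(f"polarization fraction: {polarization_fraction:.1%}")
print(f"polarization angle:    {polarization_angle:.1f} degrees")
```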

Two competing models describe how matter near a black hole can look in a binary system such as Cygnus X-1: the lamp-post model and the extended model. In the lamp-post model, the corona is compact and bound closely to the black hole. Photons bend toward the accretion disk, resulting in more reflected light. In the extended model, the corona is larger and spread around the vicinity of the black hole. In this case, the light reflected by the disk is weaker.

Since light did not bend that much under the strong gravity of the black hole, the team concluded that the black hole fit the extended corona model.

With this information, the researchers can uncover more characteristics about black holes. One example is its spin. The effects of spin can modify the space-time surrounding the black hole. Spin could also provide clues into the evolution of the black hole. It could be slowing down in speed since the beginning of the universe, or it could be accumulating matter and spinning faster.

“The black hole in Cygnus is one of many,” Takahashi said. “We would like to study more black holes using X-ray polarimetry, like those closer to the center of galaxies. Maybe we better understand black hole evolution, as well as galaxy evolution.”

Optical neural network demo


Researchers at the National Institute of Standards and Technology (NIST) have made a silicon chip that distributes optical signals precisely across a miniature brain-like grid, showcasing a potential new design for neural networks.

The human brain has billions of neurons (nerve cells), each with thousands of connections to other neurons. Many computing research projects aim to emulate the brain by creating circuits of artificial neural networks. But conventional electronics, including the electrical wiring of semiconductor circuits, often impedes the extremely complex routing required for useful neural networks.

The NIST team proposes to use light instead of electricity as a signaling medium. Neural networks already have demonstrated remarkable power in solving complex problems, including rapid pattern recognition and data analysis. The use of light would eliminate interference due to electrical charge and the signals would travel faster and farther.

“Light’s advantages could improve the performance of neural nets for scientific data analysis such as searches for Earth-like planets and quantum information science, and accelerate the development of highly intuitive control systems for autonomous vehicles,” the researchers said.

A conventional computer processes information through algorithms, or human-coded rules. By contrast, a neural network relies on a network of connections among processing elements, or neurons, which can be trained to recognize certain patterns of stimuli. A neural, or neuromorphic, computer would consist of a large, complex system of such neural networks.

Described in a new paper, the NIST chip overcomes a major challenge to the use of light signals by vertically stacking two layers of photonic waveguides — structures that confine light into narrow lines for routing optical signals, much as wires route electrical signals. This three-dimensional (3D) design enables complex routing schemes, which are necessary to mimic neural systems. Furthermore, this design can easily be extended to incorporate additional waveguiding layers when needed for more complex networks.

The stacked waveguides form a three-dimensional grid with 10 inputs or “upstream” neurons each connecting to 10 outputs or “downstream” neurons, for a total of 100 receivers. Fabricated on a silicon wafer, the waveguides are made of silicon nitride and are each 800 nanometers (nm) wide and 400 nm thick. Researchers created software to automatically generate signal routing, with adjustable levels of connectivity between the neurons.

Laser light was directed into the chip through an optical fiber. The goal was to route each input to every output group, following a selected distribution pattern for light intensity or power. Power levels represent the pattern and degree of connectivity in the circuit. The authors demonstrated two schemes for controlling output intensity: uniform (each output receives the same power) and a “bell curve” distribution (in which middle neurons receive the most power, while peripheral neurons receive less).
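The two demonstrated distributions can be written down directly: a flat weighting across the ten outputs and a Gaussian ("bell curve") weighting centred on the middle outputs, each normalized so the total power is conserved. The sketch below computes those target weightings for a hypothetical 10-output group; it says nothing about how the waveguide splitters themselves were designed.

```python
# Target power weightings for one 10-output group: uniform vs. a "bell curve"
# (Gaussian) profile centred on the middle outputs. Purely illustrative;
# not the routing software or splitter design used on the NIST chip.
import numpy as np

n_outputs = 10
outputs = np.arange(n_outputs)

uniform = np.full(n_outputs, 1.0 / n_outputs)   # equal share to every output

center, width = (n_outputs - 1) / 2, 2.0
bell = np.exp(-0.5 * ((outputs - center) / width) ** 2)
bell /= bell.sum()                              # normalize so the powers sum to 1

for i in outputs:
    print(f"output {i}: uniform {uniform[i]:.3f}   bell curve {bell[i]:.3f}")
```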

To evaluate the results, researchers made images of the output signals. All signals were focused through a microscope lens onto a semiconductor sensor and processed into image frames. This method allows many devices to be analyzed at the same time with high precision. The output was highly uniform, with low error rates, confirming precise power distribution.

“We’ve really done two things here,” said Chiles, one of the NIST researchers. “We’ve begun to use the third dimension to enable more optical connectivity, and we’ve developed a new measurement technique to rapidly characterize many devices in a photonic system. Both advances are crucial as we begin to scale up to massive optoelectronic neural systems.”