Why are some people better able to fight off the flu than others? Part of the answer, according to a new study, is related to the first flu strain we encounter in childhood.
Scientists from UCLA and the University of Arizona have found that people’s ability to fight off the flu virus is determined not only by the subtypes of flu they have had throughout their lives, but also by the sequence in which they were infected by those viruses. Their study is published in the open-access journal PLOS Pathogens.
The research offers an explanation for why some people fare much worse than others when infected with the same strain of the flu virus, and the findings could help inform strategies for minimizing the effects of the seasonal flu.
In addition, UCLA scientists, including Professor James Lloyd-Smith, who also was a senior author of the PLOS Pathogens research, recently completed a study that analyzes travel-related screening for the novel coronavirus 2019-nCoV.
The researchers report that screening travelers is not very effective for the 2019 coronavirus — it will catch fewer than half of infected travelers, on average — and that most infected travelers are undetectable because they have no symptoms yet and are unaware that they have been exposed. Stopping the spread of the virus is therefore not simply a matter of enhancing screening methods at airports and other travel hubs.
“This puts the onus on government officials and public health officials to follow up with travelers after they arrive, to isolate them and trace their contacts if they get sick later,” said Lloyd-Smith, a UCLA professor of ecology and evolutionary biology. Many governments have started to impose quarantines, or even travel bans, as they realize that screening is not sufficient to stop the spread of the coronavirus.
One major concern, Lloyd-Smith said, is that other countries, especially developing nations, lack the infrastructure and resources for those measures, and are therefore vulnerable to importing the disease.
“Much of the public health world is very concerned about the virus being introduced into Africa or India, where large populations do not have access to advanced medical care,” he said.
The researchers, including scientists from the University of Chicago and the London School of Hygiene &amp; Tropical Medicine, have developed a free online app where people can calculate the effectiveness of travel screening based on a range of parameters.
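The effect the app explores can be sketched with a toy calculation. The parameter names and numbers below are illustrative assumptions, not values from the study: a traveler is caught either by a fever scan (if already symptomatic) or by truthfully reporting a known exposure on a risk questionnaire.

```python
# Hypothetical sketch of a travel-screening effectiveness calculation.
# All parameter names and values are illustrative assumptions, not the
# published model.

def fraction_caught(frac_symptomatic, fever_sensitivity,
                    frac_aware_exposure, questionnaire_honesty):
    """Estimate the fraction of infected travelers detected at screening.

    A traveler is caught if they are symptomatic AND the fever scan flags
    them, or if they are pre-symptomatic but know they were exposed AND
    report it truthfully on a questionnaire.
    """
    p_fever = frac_symptomatic * fever_sensitivity
    p_report = (1 - frac_symptomatic) * frac_aware_exposure * questionnaire_honesty
    return p_fever + p_report

# Illustrative numbers: few travelers are symptomatic mid-incubation,
# scanners are imperfect, and most travelers are unaware of exposure.
print(fraction_caught(0.3, 0.7, 0.2, 0.25))  # roughly 0.25: well under half
```

Under assumptions like these, even generous screening catches a minority of infected travelers, which is the qualitative point the researchers make.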
Solving a decades-old question
The PLoS Pathogens study may help solve a problem that had for decades vexed scientists and health care professionals: why the same strain of the flu virus affects people with various degrees of severity.
A team that included some of the same UCLA and Arizona scientists reported in 2016 that exposure to influenza viruses during childhood gives people partial protection for the rest of their lives against distantly related influenza viruses. Biologists call the idea that past exposure to the flu virus determines a person’s future response to infections “immunological imprinting.”
The 2016 research helped overturn a commonly held belief that previous exposure to a flu virus conferred little or no immunological protection against strains that can jump from animals into humans, such as those known as swine flu or bird flu. Those strains, which have caused hundreds of spillover cases of severe illness and death in humans, are of global concern because they could gain mutations that allow them not only to jump from animal populations to humans, but also to spread rapidly from person to person.
In the new study, the researchers investigated whether immunological imprinting could explain people’s response to flu strains already circulating in the human population and to what extent it could account for observed discrepancies in how severely the seasonal flu affects people in different age groups.
To track how different strains of the flu virus affect people at different ages, the team analyzed health records that the Arizona Department of Health Services obtains from hospitals and private physicians.
Two subtypes of influenza virus, H3N2 and H1N1, have been responsible for seasonal outbreaks of the flu over the past several decades. H3N2 causes the majority of severe cases in high-risk elderly people and the majority of deaths from the flu. H1N1 is more likely to affect young and middle-aged adults, and causes fewer deaths.
The health record data revealed a pattern: People first exposed to the less severe strain, H1N1, during childhood were less likely to end up hospitalized if they encountered H1N1 again later in life than people who were first exposed to H3N2. And people first exposed to H3N2 received extra protection against H3N2 later in life.
The researchers also analyzed the evolutionary relationships between the flu strains. H1N1 and H3N2, they learned, belong to two separate branches on the influenza “family tree,” Lloyd-Smith said. While infection with one does leave the immune system better prepared to fight a future infection from the other, protection against future infections is much stronger when one is exposed to strains from the same group one has battled before, he said.
The records also revealed another pattern: People whose first childhood exposure was to H2N2, a close cousin of H1N1, did not have a protective advantage when they later encountered H1N1. That phenomenon was much more difficult to explain, because the two subtypes are in the same group, and the researchers’ earlier work showed that exposure to one can, in some cases, grant considerable protection against the other.
“Our immune system often struggles to recognize and defend against closely related strains of seasonal flu, even though these are essentially the genetic sisters and brothers of strains that circulated just a few years ago,” said lead author Katelyn Gostic, who was a UCLA doctoral student in Lloyd-Smith’s laboratory when the study was conducted and is now a postdoctoral fellow at the University of Chicago. “This is perplexing because our research on bird flu shows that deep in our immune memory, we have some ability to recognize and defend against the distantly related, genetic third cousins of the strains we saw as children.
“We hope that by studying differences in immunity against bird flus — where our immune system shows a natural ability to deploy broadly effective protection — and against seasonal flus — where our immune system seems to have bigger blind spots — we can uncover clues useful to universal influenza vaccine development.”
Around the world, influenza remains a major killer. The past two flu seasons have been more severe than expected, said Michael Worobey, a co-author of the study and head of the University of Arizona’s department of ecology and evolutionary biology. In the 2017-18 season, 80,000 people died in the U.S., more than in the swine flu pandemic of 2009, he said.
People who had their first bout of flu as children in 1955 — when the H1N1 virus was circulating but the H3N2 virus was not — were much more likely to be hospitalized with an H3N2 infection than an H1N1 infection last year, when both strains were circulating, Worobey said.
“The second subtype you’re exposed to is not able to create an immune response that is as protective and durable as the first,” he said.
The researchers hope that their findings could help predict which age groups might be severely affected during future flu seasons based on the subtype circulating. That information could also help health officials prepare their response, including decisions about who should receive certain vaccines that are only available in limited quantities.
The research was funded by the National Institutes of Health, the National Science Foundation, DARPA and the David and Lucile Packard Foundation. In 2018, the NIH’s National Institute of Allergy and Infectious Diseases announced a strategic plan to develop a universal flu vaccine.
The study’s co-authors are Rebecca Bridge of the Arizona Department of Health Services and Cecile Viboud of the Fogarty International Center at the NIH.
- Katelyn M. Gostic, Rebecca Bridge, Shane Brady, Cécile Viboud, Michael Worobey, James O. Lloyd-Smith. Childhood immune imprinting to influenza A shapes birth year-specific risk during seasonal H1N1 and H3N2 epidemics. PLOS Pathogens, 2019; 15 (12): e1008109 DOI: 10.1371/journal.ppat.1008109
Causes of cancer are being catalogued by a huge international study revealing the genetic fingerprints of DNA-damaging processes that drive cancer development. Researchers from the Wellcome Sanger Institute, Duke-NUS Medical School Singapore, University of California San Diego School of Medicine, the Broad Institute of MIT and Harvard and their collaborators around the world have achieved the most detailed list of these genetic fingerprints to date, providing clues as to how each cancer developed.
These fingerprints will allow scientists to search for previously unknown chemicals, biological pathways and environmental agents responsible for causing cancer.
The research, published in Nature today (5 February) as part of the global Pan-Cancer Project, will help scientists understand the causes of cancer, inform prevention strategies and signpost new directions for cancer diagnosis and treatment.
A further 22 studies from the Pan-Cancer Project are also published today in Nature and related journals. The collaboration, involving more than 1,300 scientists and clinicians from 37 countries, analysed more than 2,600 genomes of 38 different tumour types. The project represents an unprecedented international exploration of cancer genomes, which significantly improves our fundamental understanding of cancer and zeroes in on mechanisms of cancer development.
In the UK, someone is diagnosed with cancer every two minutes, with 363,000 new cancer cases every year. The disease causes around 165,000 deaths in the UK annually.
Cancer is caused by genetic changes — mutations — in the DNA of a cell, allowing the cell to divide uncontrollably. Many known causes of cancer, such as UV light and tobacco smoking, leave a specific fingerprint of damage in the DNA, known as a mutational signature. These fingerprints can help understand how cancers develop, and potentially, how they can be prevented. However, past studies have not been large enough to identify all potential mutational signatures.
The fingerprint study identified mutational signatures that had not been seen before, from single-letter ‘typo’ mutations to slightly larger insertions and deletions of genetic code. The result is the largest database of reference mutational signatures to date. Only about half of all the mutational signatures have known causes; however, the resource can now be used to help find more of these causes and to better understand cancer development.
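The kind of tallying that precedes signature extraction can be illustrated with a toy example. Real pipelines classify single-base substitutions into 96 trinucleotide-context categories before factoring them into signatures; the hypothetical sketch below only counts (context, substitution) pairs on a short made-up reference string.

```python
# A minimal sketch of how single-letter 'typo' mutations are tallied by
# sequence context before signature analysis. The reference string and
# mutation calls are invented for illustration.

from collections import Counter

def context_counts(reference, mutations):
    """mutations: list of (position, alternate_base) on a reference string."""
    counts = Counter()
    for pos, alt in mutations:
        ref_base = reference[pos]
        # Trinucleotide context: one base on each side of the change.
        context = reference[pos - 1:pos + 2]
        counts[(context, f"{ref_base}>{alt}")] += 1
    return counts

ref = "ACGTACGTACGT"
calls = [(1, "T"), (5, "T"), (9, "T")]  # three C>T 'typos'
print(context_counts(ref, calls))  # all three share the ACG context
```

Recurring over-represented categories like this, aggregated across thousands of tumours, are what the signature-extraction step decomposes into the fingerprints described above.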
Professor Steven Rozen, a senior author from Duke-NUS Medical School, Singapore, said: “Some types of these DNA fingerprints, or mutational signatures, reflect how the cancer could respond to drugs. Further research into this could help to diagnose some cancers and what drugs they might respond to.”
Professor Gad Getz, a senior author from the Broad Institute of MIT and Harvard, and Massachusetts General Hospital, said, “The availability of a large number of whole genomes enabled us to apply more advanced analytical methods to discover and refine mutational signatures and expand our study into additional types of mutations. Our new collection of signatures provides a more complete picture of biological and chemical processes that damage or repair DNA and will enable researchers to decipher the mutational processes that affect the genomes of newly sequenced cancers.”
Another study in the Pan-Cancer Project, published in Nature today, discovered that larger, more complex genetic changes that rearrange the DNA could also act as mutational signatures, and point towards causes of cancer. Researchers from the Wellcome Sanger Institute and the Broad Institute of MIT and Harvard and their collaborators found 16 of these signatures that spanned from rearrangements of single genes to entire chromosomes.
The global Pan-Cancer Project is the largest and most comprehensive study of whole cancer genomes yet. The collaboration has created a huge resource of primary cancer genomes, available to researchers worldwide to advance cancer research.
- Ludmil B. Alexandrov, Jaegil Kim, Nicholas J. Haradhvala, Mi Ni Huang, Alvin Wei Tian Ng, Yang Wu, Arnoud Boot, Kyle R. Covington, Dmitry A. Gordenin, Erik N. Bergstrom, S. M. Ashiqul Islam, Nuria Lopez-Bigas, Leszek J. Klimczak, John R. McPherson, Sandro Morganella, Radhakrishnan Sabarinathan, David A. Wheeler, Ville Mustonen, Gad Getz, Steven G. Rozen, Michael R. Stratton. The repertoire of mutational signatures in human cancer. Nature, 2020; 578 (7793): 94 DOI: 10.1038/s41586-020-1943-3
- Yilong Li, Nicola D. Roberts, Jeremiah A. Wala, Ofer Shapira, Steven E. Schumacher, Kiran Kumar, Ekta Khurana, Sebastian Waszak, Jan O. Korbel, James E. Haber, Marcin Imielinski, Joachim Weischenfeldt, Rameen Beroukhim, Peter J. Campbell. Patterns of somatic structural variation in human cancer genomes. Nature, 2020; 578 (7793): 112 DOI: 10.1038/s41586-019-1913-9
- The ICGC/TCGA Pan-Cancer Analysis of Whole Genomes Consortium. Pan-cancer analysis of whole genomes. Nature, 2020 DOI: 10.1038/s41586-020-1969-6
Scientists at the University of Groningen and the University Medical Center Groningen used molecular motors to manipulate the protein matrix on which bone marrow-derived mesenchymal stem cells are grown. Rotating motors altered the protein structure, which resulted in a bias of the stem cells to differentiate into bone cells (osteoblasts). Without rotation, the stem cells tended to remain multipotent. These results, which could be used in tissue engineering, were published in Science Advances on 29 January.
‘Cells are sensitive to the structure of the surface that they attach to,’ explains Patrick van Rijn, associate professor in Materiobiology and Nanobiomaterials. ‘And movement is an important driver in biology, especially continuous movement.’ That is why Van Rijn, Feringa and their colleagues decided to use molecular motors to manipulate the protein matrix on which stem cells are grown. The light-driven motor molecules were designed by Ben Feringa, the 2016 Nobel Laureate in Chemistry.
The scientists linked molecular motors to a glass surface. Subsequently, the surface was coated with protein and either exposed to UV irradiation to power the motors or not exposed to it at all. After about an hour, the motor movement was stopped and cells were seeded onto the protein layer and left to attach. Finally, differentiation factors were added. These experiments showed that cells grown on protein that was submitted to the rotary motion of the molecular motors tended to specialize into bone cells more often, while cells seeded on protein that was not disturbed were more inclined to maintain their stem-cell properties.
Observations of the protein layer using atomic force microscopy and simulations of the interaction between the motor molecules and proteins, performed by Prof. Marrink’s research group, showed that the movement induced subtle structural changes in the protein matrix. ‘The movement of motor molecules interferes with the alpha-helices in the proteins, which causes structural changes,’ explains Van Rijn. He compares it to the difference in texture between an unwhipped egg white and a whipped one.
The change in the surface structure of the adhered protein affects how the cells attach, for example how much they stretch out. This sets off a signaling cascade that eventually leads to altered behavior, such as the differentiation into bone cells. Thus, molecular movement leads to nanoscopic changes in surface structure, which in turn leads to differences in cell attachment, cell morphology and eventually, cell differentiation. ‘It’s like a domino effect, where smaller stones consecutively topple slightly larger ones so that a large effect can be achieved with a small trigger.’
‘Changing the properties of a surface to affect cell fate has been used before,’ says Van Rijn. However, this was done primarily with switches, so there was just a change from one state to another. ‘In our study, we had continuous movement, which is much more in line with the continuous motion found in biological transport and communication systems. The fact that the motors are driven by light is important,’ Van Rijn adds. ‘Light can be carefully controlled in space and time. This would allow us to create complex geometries in the growth matrix, which then result in different properties for the cells.’ Therefore, light-controlled molecular motors could be a useful tool in tissue engineering.
In the future, robots could take blood samples, benefiting patients and healthcare workers alike.
A Rutgers-led team has created a blood-sampling robot that performed as well or better than people, according to the first human clinical trial of an automated blood drawing and testing device.
The device provides quick results and would allow healthcare professionals to spend more time treating patients in hospitals and other settings.
The results, published in the journal Technology, were comparable to or exceeded clinical standards, with an overall success rate of 87% for the 31 participants whose blood was drawn. For the 25 people whose veins were easy to access, the success rate was 97%.
The device includes an ultrasound image-guided robot that draws blood from veins. A fully integrated device, which includes a module that handles samples and a centrifuge-based blood analyzer, could be used at bedsides and in ambulances, emergency rooms, clinics, doctors’ offices and hospitals.
Venipuncture, which involves inserting a needle into a vein to get a blood sample or perform IV therapy, is the world’s most common clinical procedure, with more than 1.4 billion performed yearly in the United States. But clinicians fail in 27% of patients without visible veins, 40% of patients without palpable veins and 60% of emaciated patients, according to previous studies.
Repeated failures to start an IV line boost the likelihood of phlebitis, thrombosis and infections, and may require targeting large veins in the body or arteries — at much greater cost and risk. As a result, venipuncture is among the leading causes of injury to patients and clinicians. Moreover, difficulty in accessing veins can increase procedure time by up to an hour, require more staff and cost more than $4 billion a year in the United States, according to estimates.
“A device like ours could help clinicians get blood samples quickly, safely and reliably, preventing unnecessary complications and pain in patients from multiple needle insertion attempts,” said lead author Josh Leipheimer, a biomedical engineering doctoral student in the Yarmush lab in the biomedical engineering department in the School of Engineering at Rutgers University-New Brunswick.
In the future, the device could be used in such procedures as IV catheterization, central venous access, dialysis and placing arterial lines. Next steps include refining the device to improve success rates in patients whose veins are difficult to access. Data from this study will be used to enhance artificial intelligence in the robot to improve its performance.
Rutgers co-authors include Max L. Balter and Alvin I. Chen, who both graduated with doctorates; Enrique J. Pantin at Rutgers Robert Wood Johnson Medical School; Professor Kristen S. Labazzo; and principal investigator Martin L. Yarmush, the Paul and Mary Monroe Endowed Chair and Distinguished Professor in the Department of Biomedical Engineering. A researcher at Icahn School of Medicine at Mount Sinai Hospital also contributed to the study.
- Josh M. Leipheimer, Max L. Balter, Alvin I. Chen, Enrique J. Pantin, Alexander E. Davidovich, Kristen S. Labazzo, Martin L. Yarmush. First-in-human evaluation of a hand-held automated venipuncture device for rapid venous blood draws. TECHNOLOGY, 2020; 1 DOI: 10.1142/S2339547819500067
Bilingual children use as many words as monolingual children when telling a story, and demonstrate high levels of cognitive flexibility, according to new research by University of Alberta scientists.
“We found that the number of words that bilingual children use in their stories is highly correlated with their cognitive flexibility — the ability to switch between thinking about different concepts,” said Elena Nicoladis, lead author and professor in the Department of Psychology in the Faculty of Science. “This suggests that bilinguals are adept at using the medium of storytelling.”
Vocabulary is a strong predictor of school achievement, and so is storytelling. “These results suggest that parents of bilingual children do not need to be concerned about long-term school achievement,” said Nicoladis. “In a storytelling context, bilingual kids are able to use this flexibility to convey stories in creative ways.”
The research examined a group of French-English bilingual children who have been taught two languages since birth, rather than learning a second language later in life. Results show that bilingual children used just as many words to tell a story in English as monolingual children. Participants also used just as many words in French as they did in English when telling a story.
Previous research has shown that bilingual children score lower than monolingual children on traditional vocabulary tests, meaning these results are changing our understanding of multiple languages and cognition in children.
“The past research is not surprising,” added Nicoladis. “Learning a word is related to how much time you spend in each language. For bilingual children, time is split between languages. So, unsurprisingly, they tend to have lower vocabularies in each of their languages. However, this research shows that as a function of storytelling, bilingual children are equally strong as monolingual children.”
This research used a new, highly sensitive measure for examining cognitive flexibility, examining a participant’s ability to switch between games with different rules, while maintaining accuracy and reaction time. This study builds on previous research examining vocabulary in bilingual children who have learned English as a second language.
As the number of humans and their technologies has grown, human impact on the natural world now equals or exceeds that of natural processes, according to scientists.
Many researchers formally name this period of human dominance of natural systems the Anthropocene, but there is heated debate over whether this naming should take place and when the period began.
In a co-authored paper published online in the journal Anthropocene, University of Illinois at Chicago paleontologist Roy Plotnick argues that the fossil record of mammals will provide a clear signal of the Anthropocene.
He and Karen Koy of Missouri Western State University report that the number of humans and their animals greatly exceeds that of wild animals.
As an example, in the state of Michigan alone, humans and their animals compose about 96% of the total mass of animals. There are as many chickens as people in the state, and the same should be true in many places in the United States and the world, they say.
“The chance of a wild animal becoming part of the fossil record has become very small,” said Plotnick, UIC professor of earth and environmental sciences and the paper’s lead author. “Instead, the future mammal record will be mostly cows, pigs, sheep, goats, dogs, cats, etc., and people themselves.”
While humans bury most of their dead in cemeteries and have for centuries, their activities have markedly changed how and where animals are buried.
These impacts include alterations in the distribution and properties of natural sites of preservation, associated with shifts in land use and climate change; the production of novel sites for preservation, such as landfills and cemeteries; and changes in the breakdown of animal and human carcasses.
Additionally, the use of large agricultural equipment and increased domestic animal density due to intensive animal farming likely increases the rate of and changes the kind of damage to bones, according to the paleontologists.
“Fossil mammals occur in caves, ancient lakebeds and river channels, and are usually only teeth and isolated bones,” he said. “Animals that die on farms or in mass deaths due to disease often end up as complete corpses in trenches or landfills, far from water.”
Consequently, the fossils from the world today will be unique in the Earth’s history and unmistakable to paleontologists 100,000 years from now, according to the researchers.
“In the far future, the fossil record of today will have a huge number of complete hominid skeletons, all lined up in rows,” Plotnick said.
A new study led by Simon Fraser University’s Dean of Science, Prof. Paul Kench, has discovered new evidence of sea-level variability in the central Indian Ocean.
The study, which provides new details about sea levels in the past, concludes that sea levels in the central Indian Ocean have risen by close to a meter in the last two centuries.
Prof. Kench says, “We know that certain types of fossil corals act as important recorders of past sea levels. By measuring the ages and the depths of these fossil corals, we are finding that there were periods, several hundred years ago, when the sea level was much lower than we thought in parts of the Indian Ocean.”
He says understanding where sea levels have been historically, and what happens as they rise, will provide greater insights into how coral reefs systems and islands may be able to respond to the changes in sea levels in the future.
Underscoring the serious threat posed to coastal cities and communities in the region, the ongoing study, which began in 2017, further suggests that if such acceleration continues over the next century, sea levels in the Indian Ocean will rise to their highest level in recorded history.
Stem cells located in the bone marrow generate and control the production of blood and immune cells. Researchers from EMBL, DKFZ and HI-STEM have now developed new methods to reveal the three-dimensional organization of the bone marrow at the single cell level. Using this approach the teams have identified previously unknown cell types that create specific local environments required for blood generation from stem cells. The study, published in Nature Cell Biology, reveals an unexpected complexity of the bone marrow and its microdomains at an unprecedented resolution and provides a novel scientific basis to study blood diseases such as leukemias.
In the published study, researchers from the European Molecular Biology Laboratory (EMBL), the German Cancer Research Center (DKFZ) and the Heidelberg Institute for Stem Cell Technology and Experimental Medicine (HI-STEM gGmbH) present new methods for characterising complex organs. The team focused their research on the murine bone marrow, as it harbours blood stem cells that are responsible for life-long blood production. Because the bone marrow environment, also called the niche, can influence stem cells and sustain blood production, there is growing interest in exploiting it as a target for novel leukemia treatments. “So far, very little was known about how different cells are organised within the bone marrow and how they interact to maintain blood stem cells,” explains Chiara Baccin, post-doc in the Steinmetz Group at EMBL. “Our approach unveils the cellular composition, the three-dimensional organisation and the intercellular communication in the bone marrow, a tissue that has thus far been difficult to study using conventional methods,” adds Jude Al-Sabah, PhD student in the Haas Group at HI-STEM and DKFZ.
In order to understand which cells can be found in the bone marrow, where they are localised and how they might impact on stem cells, the researchers combined single-cell and spatial transcriptomics with novel computational methods. By analysing the RNA content of individual bone marrow cells, the team identified 32 different cell types, including extremely rare and previously unknown cell types. “We believe that these rare ‘niche cells’ establish unique environments in the bone marrow that are required for stem cell function and production of new blood and immune cells,” explains Simon Haas, group leader at the DKFZ and HI-STEM, and one of the initiators of the study.
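As a rough illustration of how cells can be sorted into types from their RNA content, the hypothetical sketch below assigns a cell’s expression profile to the nearest of a set of made-up reference profiles (nearest-centroid assignment). The actual study used far richer single-cell and spatial transcriptomics analyses; the gene values and type names here are invented.

```python
# Toy sketch of grouping cells into types by expression profile.
# Real single-cell pipelines cluster thousands of genes across thousands
# of cells; here, invented 3-gene profiles are assigned to whichever
# reference profile is closest (nearest-centroid assignment).

def assign_cell_type(profile, centroids):
    """Return the name of the nearest reference profile (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: dist2(profile, centroids[name]))

centroids = {  # made-up marker-gene averages per hypothetical cell type
    "stem-like": (9.0, 1.0, 0.5),
    "niche":     (0.5, 8.0, 2.0),
}
print(assign_cell_type((8.5, 1.5, 0.4), centroids))  # prints "stem-like"
```

Grouping every cell this way, then asking where each group sits in the tissue, is the intuition behind combining single-cell profiles with spatial data.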
Using novel computational methods, the researchers were not only able to determine the organisation of the different cell types in the bone marrow in 3D, but could also predict their cellular interactions and communication. “It’s the first evidence that spatial interactions in a tissue can be deduced computationally on the basis of genomic data,” explains Lars Velten, staff scientist in the Steinmetz Group.
“Our dataset is publicly accessible to any laboratory in the world and it could be instrumental in refining in vivo studies,” says Lars Steinmetz, group leader and director of the Life Science Alliance at EMBL Heidelberg. The data, which is now already used by different teams all over the world, is accessible via a user-friendly web app.
The developed methods can in principle be used to analyse the 3D organisation of any organ at the single-cell level. “Our approach is widely applicable and could also be used to study the complex pathology of human diseases such as anemia or leukemia,” highlights Andreas Trumpp, managing director of HI-STEM and division head at DKFZ.
In the last decade, scientists have made tremendous progress in understanding that groups of bacteria and viruses that naturally coexist throughout the human body play an important role in some vital functions like digestion, metabolism and even fighting off diseases. But understanding just how they do it remains a question.
Researchers from Drexel University are hoping to help answer that question through a clever combination of high-throughput genetic sequencing and natural language processing computer algorithms. Their research, which was recently published in the journal PLOS ONE, reports a new method of analyzing the codes found in RNA that can delineate human microbial communities and reveal how they operate.
Much of the research on the human microbial environment — or microbiome — has focused on identifying all of the different microbe species. And the nascent development of treatments for microbiota-linked maladies operates under the idea that imbalances or deviations in the microbiome are the source of health problems, such as indigestion or Crohn’s disease.
But to properly correct these imbalances it’s important for scientists to have a broader understanding of microbial communities as they exist — both in the afflicted areas and throughout the entire body.
“We are really just beginning to scrape the surface of understanding the health effects of microbiota,” said Gail Rosen, PhD, an associate professor in Drexel’s College of Engineering, who was an author of the paper. “In many ways scientists have jumped into this work without having a full picture of what these microbial communities look like, how prevalent they are and how their internal configuration affects their immediate environment within the human body.”
Rosen heads Drexel’s Center for Biological Discovery from Big Data, a group of researchers that has been applying algorithms and machine learning to help decipher massive amounts of genetic sequencing information that has become available in the last handful of years. Their work and similar efforts around the world have moved microbiology and genetics research from the wet lab to the data center — creating a computational approach to studying organism interactions and evolution, called metagenomics.
In this type of research, a scan of a genetic material sample — DNA or RNA — can be interpreted to reveal the organisms that are likely present. The method presented by Rosen’s group takes that one step further by analyzing the genetic code to spot recurring patterns, an indication that certain groups of organisms — microbes in this case — are found together so frequently that it’s not a coincidence.
“We call this method ‘themetagenomics,’ because we are looking for recurring themes in microbiomes that are indicators of co-occurring groups of microbes,” Rosen said. “There are thousands of species of microbes living in the body, so if you think about all the permutations of groupings that could exist you can imagine what a daunting task it is to determine which of them are living in community with each other. Our method puts a pattern-spotting algorithm to work on the task, which saves a tremendous amount of time and eliminates some guesswork.”
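The pattern-spotting idea described above is a form of topic modeling. As a rough illustration only (the group's actual tool and data differ), latent Dirichlet allocation applied to a toy table of per-sample microbe counts can recover planted groups of co-occurring taxa, the "themes"; all taxa names and numbers here are made up:

```python
# Illustrative sketch: discover co-occurring microbe groups ("themes")
# in a toy abundance table with LDA. Rows = samples, columns = taxa.
# This is not the paper's themetagenomics pipeline, just the core idea.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
taxa = ["E.coli", "B.fragilis", "L.reuteri", "P.copri"]

# Two planted themes: the first two taxa co-occur, as do the last two.
theme_a = rng.poisson([50, 45, 2, 1], size=(10, 4))
theme_b = rng.poisson([1, 2, 40, 55], size=(10, 4))
counts = np.vstack([theme_a, theme_b])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# Each learned theme is a weighting over taxa; the top-weighted taxa
# per theme reveal which microbes the model groups together.
for k, weights in enumerate(lda.components_):
    top = [taxa[i] for i in weights.argsort()[::-1][:2]]
    print(f"theme {k}: {top}")
```

On data this cleanly separated, the two learned themes line up with the two planted groups; real microbiome tables are far noisier and higher-dimensional, which is where the time savings Rosen describes come in.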
Current methods for studying microbiota, gut bacteria for example, take a sample from an area of the body and then look at the genetic material that’s present. This process inherently lacks important context, according to the authors.
“It’s impossible to really understand what microbe communities are doing if we don’t first understand the extent of the community and how frequently and where else they might be occurring in the body,” said Steve Woloszynek, PhD, an MD trainee in Drexel’s College of Medicine and co-author of the paper. “In other words, it’s hard to develop treatments to promote natural microbial coexistence if their ‘natural state’ is not yet known.”
Obtaining a full map of microbial communities, using themetagenomics, allows researchers to observe how they change over time — both in healthy people and those suffering from diseases. And observing the difference between the two provides clues to the function of the community, as well as illuminating the configuration of microbe species that enables it.
“Most metagenomics methods just tell you which microbes are abundant — therefore likely important — but they don’t really tell you much about how each species is supporting other community members,” Rosen said. “With our method you get a picture of the configuration of the community — for example, it may have E. coli and B. fragilis as the most abundant microbes and in pretty equal numbers — which may indicate that they’re cross-feeding. Another community may have B. fragilis as the most abundant microbe, with many other microbes in equal, but lower, numbers — which could indicate that they are feeding off whatever B. fragilis is making, without any cooperation.”
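The distinction Rosen draws, a community of near-equal partners versus one dominated by a single species, can be captured numerically with a simple evenness score. A minimal sketch, using made-up abundances and Pielou's evenness index rather than anything from the paper:

```python
# Illustrative sketch of community "configuration": relative abundances
# plus Pielou's evenness distinguish a balanced community (possible
# cross-feeding) from one dominated by a single microbe. Toy data only.
import math

def pielou_evenness(counts):
    """Shannon entropy of relative abundances, normalized to [0, 1]."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    h = -sum(p * math.log(p) for p in props)
    return h / math.log(len(props)) if len(props) > 1 else 0.0

balanced = {"E.coli": 48, "B.fragilis": 52}
dominated = {"B.fragilis": 80, "E.coli": 7, "P.copri": 6, "L.reuteri": 7}

print(round(pielou_evenness(list(balanced.values())), 2))   # near 1.0
print(round(pielou_evenness(list(dominated.values())), 2))  # well below 1.0
```

A score near 1 means the members are present in roughly equal numbers; a low score flags a community organized around one dominant species.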
One of the ultimate goals of analyzing human microbiota is to use the presence of certain microbe communities as indicators to identify diseases like Crohn’s or even specific types of cancer. To test their new method, the Drexel researchers put it up against similar topic modeling procedures that diagnose Crohn’s and mouth cancer by measuring the relative abundance of certain genetic sequences.
The themetagenomics method proved to be just as accurate at predicting the diseases, but much faster than the other topic modeling methods — minutes versus days — and it also teases out how each microbe species in the indicator community may contribute to the severity of the disease. With this level of granularity, researchers will be able to home in on particular genetic groupings when developing targeted treatments.
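The prediction step works because each sample's mix of themes is itself a compact feature vector. A hedged sketch of that idea, with entirely synthetic "healthy" and "disease" samples and a generic classifier standing in for the paper's actual models:

```python
# Illustrative sketch: per-sample theme proportions from LDA become
# features for disease prediction. Synthetic toy data; not the paper's
# Crohn's or oral-cancer datasets or its exact modeling procedure.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Healthy samples dominated by one taxa group, disease samples by another.
healthy = rng.poisson([60, 55, 3, 2], size=(15, 4))
disease = rng.poisson([2, 4, 50, 65], size=(15, 4))
X = np.vstack([healthy, disease])
y = np.array([0] * 15 + [1] * 15)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(X)          # per-sample theme proportions

clf = LogisticRegression().fit(theta, y)
print(clf.score(theta, y))            # training accuracy on the toy data
```

Because the themes compress thousands of taxa into a handful of proportions, the downstream classifier is small and fast to train, which is consistent with the minutes-versus-days speedup reported above.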
The group has made its themetagenomics analysis tools publicly available in hopes of speeding progress toward cures and treatments for these maladies.
“It’s very early right now, but the more that we understand about how the microbiome functions — even just knowing that groups may be acting together — then we can look into the metabolic pathways of these groups and intervene or control them, thus paving the way for drug development and therapy research,” Rosen said.
In contrast to conventional electronics, spintronics (spin electronics) uses the spin of electrons for sensing, information storage, transport, and processing. Potential advantages are nonvolatility, increased data processing speed, decreased electric power consumption, and higher integration densities compared to conventional semiconductor devices. Molecular spintronics aims for the ultimate step in miniaturization of spintronics by striving to actively control the spin states of individual molecules. Chemists and physicists at Kiel University joined forces with colleagues from France and Switzerland to design, deposit and operate single molecular spin switches on surfaces. The newly developed molecules feature stable spin states and do not lose their functionality upon adsorption on surfaces. They present their results in the current issue of the journal Nature Nanotechnology.
The spin states of the new compounds are stable for at least several days. “This is achieved by a design trick that resembles the fundamental electronic circuits in computers, the so-called flip-flops. Bistability, or switching between 0 and 1, is realized by looping the output signal back to the input,” says experimental physicist Dr. Manuel Gruber from Kiel University. The new molecules have three properties that are coupled with each other in such a feedback loop: their shape (planar or bent), the proximity of two subunits, called coordination (yes or no), and the spin state (high-spin or low-spin). Thus, the molecules are locked either in one or the other state. Upon sublimation and deposition on a silver surface, the switches self-assemble into highly ordered arrays. Each molecule in such an array can be separately addressed with a scanning tunneling microscope and switched between the states by applying a positive or negative voltage.
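The flip-flop analogy can be made concrete in a few lines. The sketch below is purely illustrative, not a physical model: the pairing of shape, coordination and spin values is assumed for the example, and the "voltage pulse" stands in for the scanning-tunneling-microscope bias described above.

```python
# Illustrative sketch of the flip-flop analogy: three coupled properties
# lock each other into one of two self-consistent states, and only a
# deliberate voltage pulse of the right sign flips all three at once.
# State contents are hypothetical, chosen only to show the bistability.

STATE_A = {"shape": "planar", "coordinated": False, "spin": "low"}
STATE_B = {"shape": "bent", "coordinated": True, "spin": "high"}

def apply_pulse(state, voltage):
    """A positive pulse drives the switch to one state, a negative to the other."""
    return dict(STATE_B) if voltage > 0 else dict(STATE_A)

molecule = dict(STATE_A)
molecule = apply_pulse(molecule, +1.0)   # switches to the other state
molecule = apply_pulse(molecule, +1.0)   # repeated pulse: stays put (nonvolatile)
molecule = apply_pulse(molecule, -1.0)   # opposite polarity switches it back
```

Like an electronic flip-flop, the state persists until an input of the opposite polarity arrives, which is what makes the molecule usable as a memory element.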
“Our new spin switch realizes in just one molecule what takes several components like transistors and resistors in conventional electronics. That’s a big step towards further miniaturization,” Dr. Manuel Gruber and organic chemist Prof. Dr. Rainer Herges explain. The next step will be to increase the complexity of the compounds to implement more sophisticated operations.
Molecules are the smallest constructions that can be designed and built with atomic precision and predictable properties. Their response to electrical or optical stimuli and their custom-designed chemical and physical functionality make them unique candidates to develop new classes of devices such as controllable surface catalysts or optical devices.