Virus Versus Virus: ‘ReScan’ Antibody Test is a Powerful New Tool (Medicine)

Serological tests, also known as “antibody tests,” are one of the foundations of clinical medicine, with a particularly important role in infectious disease. They can detect active infections, but can also identify past exposures to infectious agents. These are both important pieces of information that can guide clinical care, and help explain why some people experience mild disease whereas others are much more severely affected.

During pandemics and epidemics, public health officials must identify individuals who are actively infected or have previously been infected, both to control the spread of disease and, in the case of vaccine-preventable illnesses, to prioritize individuals for vaccination. The SARS-CoV-2 pandemic has shone a bright light on the crucial need for accurate serological tests, especially now that vaccines are beginning to be deployed worldwide.

Now scientists led by Colin Zamecnik, PhD, a research scientist at UCSF and the Chan Zuckerberg Biohub, and UCSF Assistant Professor of Medicine Jayant Rajan, MD, PhD, have developed ReScan, an innovative new serological test that employs a specialized version of the technique known as phage display.

Colin Zamecnik (left), PhD, and Jayant Rajan, MD, PhD © UCSF

Building Phage Libraries

In research honored with a Nobel Prize in 2018, researchers showed that phage – a type of virus that attacks bacteria – can be made to incorporate chunks of a diverse range of genetic material and then “display” a corresponding protein on their surface. This technique has attracted considerable interest among scientists who study infectious diseases and autoimmune diseases, because it allows for the rapid development of massive protein libraries that can be used to perform fast, unbiased diagnostic screens of clinical samples from patients.

For example, by using phage to display protein fragments from the approximately 3,000 viruses known to infect humans, a UCSF/CZ Biohub research team led by Michael Wilson, MD, associate professor of neurology, showed that a mysterious polio-like illness that affects children is probably caused by a common viral agent. In another application, by creating a phage library of 700,000 human protein fragments, Wilson and Joe DeRisi, PhD, UCSF professor of biochemistry and biophysics and CZ Biohub co-president, led a team that identified a new autoimmune disease that affects men with testicular cancer.

In the new work, first authors Zamecnik and Rajan joined Wilson, DeRisi and colleagues in using the complete genome sequence of the SARS-CoV-2 virus to create a phage library representing all of the virus’s proteins, which they were able to deploy in a clinical diagnostic format in just a matter of weeks.

“The versatility and quick turnaround of phage display libraries make them a great tool for fast-moving public health emergencies,” said Zamecnik, “and we were very pleased with ReScan’s performance in this environment.”

A key advantage of using phage display in this context, Zamecnik said, is that in addition to detecting COVID patients’ immune response to SARS-CoV-2 infection, the scientists could identify precisely which parts of the virus are most strongly targeted by the immune system.

Rapid, Adaptable Technology

As reported Oct. 20 in Cell Reports Medicine, to create ReScan, Zamecnik repurposed a special type of liquid-transfer robot called an ECHO to print test strips with hundreds of spots, each made up of phage displaying various SARS-CoV-2 proteins. These test strips are a fresh take on the microarray technology originally developed by DeRisi to confirm that a coronavirus now known as SARS-CoV-1 was responsible for the first SARS outbreak in Asia in the early 2000s.

When individual patients’ blood samples are applied to the ReScan strips, antibodies in the blood bind to the phage making up the spots on the strips, each of which corresponds to a particular part of the virus. By comparing results from many strips, researchers can determine which spots are most commonly bound by antibodies across many patients, and use this information to decide which viral proteins are the best candidates to incorporate into further tests to make the most accurate diagnoses.

ReScan test strips from two patients reveal different antibody responses. Green dots represent viral protein sequences bound by antibodies in each patient. © UCSF
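To make that aggregation step concrete, here is a minimal sketch in Python of how spot intensities might be pooled across patients to rank candidate antigens. The array values, threshold, and variable names are illustrative assumptions, not the published ReScan pipeline (which relies on the dotblotr software described below).

```python
import numpy as np

# Illustrative stand-in data: one row per patient strip, one column per
# phage spot (values are made up, not real ReScan measurements).
strip_intensities = np.array([
    [0.9, 0.1, 0.8, 0.2],   # patient 1
    [0.7, 0.2, 0.9, 0.1],   # patient 2
    [0.8, 0.1, 0.7, 0.3],   # patient 3
])

threshold = 0.5  # assumed intensity above which a spot counts as antibody-bound

# Fraction of patients whose antibodies bound each spot
hit_rate = (strip_intensities > threshold).mean(axis=0)

# Spots bound in the most patients are the strongest candidates to carry
# forward into focused follow-up tests.
ranked_spots = np.argsort(hit_rate)[::-1]
print("Spots ranked by cross-patient hit rate:", ranked_spots)
```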

But one problem remained. “Our printing process for the strips was super-fast – 10 to 100 times faster than commercially available printers – but analyzing them was slow and tedious,” said Zamecnik. “What we really needed was a high-throughput image-processing system to scan lots of strips in parallel and identify the best ‘hot spots’ to concentrate on.”

Last November, Zamecnik described this bottleneck to bioengineer and image-analysis expert Kevin Yamauchi, PhD, a former Biohub scientist who is now a postdoctoral researcher at Biozentrum in Switzerland. Zamecnik and Yamauchi quickly drew up some specifications, and Yamauchi arrived at an elegant solution in the form of “dotblotr,” an open-source software package that quickly analyzes ReScan strips and reports the best SARS-CoV-2 phage to use.

With this information in hand, scientists can go back to the stock from which the strips were printed and easily make more tests that are focused just on the spots that elicited the strongest immune response. Phage can make copies of themselves, so they are an inexhaustible test resource – from a minuscule amount of the original phage, trillions more can be grown in a matter of hours with no complex procedures required.

The best diagnostic proteins identified by ReScan can be packaged into more standard lab tests, and in the Cell Reports Medicine paper, the authors report that the performance of one such test was similar to or better than that of commercially available serology platforms.

“Phage display, especially when combined with the printable microarray strips we’ve developed in ReScan, is a rapid and highly adaptable technology that can be easily retooled to meet the challenges posed by emerging infectious diseases of the future,” Zamecnik said.

Reference: Colin R. Zamecnik, Jayant V. Rajan, Kevin A. Yamauchi, Michel C. Nussenzweig, Joseph L. DeRisi, Michael R. Wilson, et al., “ReScan, a Multiplex Diagnostic Pipeline, Pans Human Sera for SARS-CoV-2 Antigens,” Cell Reports Medicine, 1(7), 2020. https://www.cell.com/cell-reports-medicine/fulltext/S2666-3791(20)30165-8 https://doi.org/10.1016/j.xcrm.2020.100123

Provided by University of California San Francisco

Brain Cells Most Vulnerable to Alzheimer’s Disease Identified by Scientists (Neuroscience)

Findings Could Lead to Targeted Treatments to Boost Brain’s Resilience.

A major mystery in Alzheimer’s disease research is why some brain cells succumb to the creeping pathology of the disease years before symptoms first appear, while others seem impervious to the degeneration surrounding them until the disease’s final stages.

An image of human brain samples used to study why some brain cells are more vulnerable to Alzheimer’s disease than others. Image by Rana Eser, UCSF Grinberg lab

Now, in a study published Jan. 10, 2021, in Nature Neuroscience, a team of molecular biologists and neuropathologists from the UCSF Weill Institute for Neurosciences has joined forces to identify for the first time the neurons that are among the first victims of the disease – accumulating toxic “tangles” and dying off earlier than neighboring cells.

“We know which neurons are first to die in other neurodegenerative diseases like Parkinson’s disease and ALS, but not Alzheimer’s,” said co-senior author Martin Kampmann, PhD, an associate professor in the UCSF Institute for Neurodegenerative Diseases and a Chan Zuckerberg Biohub Investigator. “If we understood why these neurons are so vulnerable, maybe we could identify interventions that could make them, and the brain as a whole, more resilient to the disease.”

Alzheimer’s researchers have long studied why certain cells are more prone to producing the toxic tangles of the protein known as tau, whose spread through the brain drives widespread cell death and the resulting progressive memory loss, dementia, and other symptoms. But researchers have not looked closely at whether all cells are equally vulnerable to the toxic effects of these protein accumulations.  

“The belief in the field has been that once these trash proteins are there, it’s always ‘game over’ for the cell, but our lab has been finding that that is not the case,” said Lea Grinberg, MD, the study’s other senior author, an associate professor and John Douglas French Alzheimer’s Foundation Endowed Professor in the UCSF Memory and Aging Center. “Some cells end up with high levels of tau tangles well into the progression of the disease, but for some reason don’t die. It has become a pressing question for us to understand the specific factors that make some cells selectively vulnerable to Alzheimer’s pathology, while other cells appear able to resist it for years, if not decades.” 

To identify selectively vulnerable neurons, the researchers studied brain tissue from people who had died at different stages of Alzheimer’s disease, obtained from the UCSF Neurodegenerative Disease Brain Bank and the Brazilian BioBank for Aging Studies, a unique resource co-founded by Grinberg. The São Paulo-based biobank collects tissue samples from a broad population of deceased individuals, including many without a neurological diagnosis whose brains nevertheless show signs of very early-stage neurodegenerative disease, which is otherwise very difficult to study in humans.  

First, led by Kampmann lab MD/PhD student Kun Leng and PhD student Emmi Li, the study’s co-first authors, the team studied tissue from 10 donor brains using a technique called single-nucleus RNA sequencing, which let them group neurons based on patterns of gene activity. In a brain region called the entorhinal cortex, one of the first areas attacked by Alzheimer’s, the researchers identified a particular subset of neurons that began to disappear very early in the disease. Later in the course of the disease, the researchers found, a similar group of neurons was also first to die off when degeneration reached the brain’s superior frontal gyrus.
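For readers curious what “grouping neurons based on patterns of gene activity” involves, the sketch below shows a generic single-nucleus clustering workflow in Python using the Scanpy library. The input file name is hypothetical, and the steps are a standard pipeline of this kind rather than the authors’ published analysis.

```python
import scanpy as sc

# Hypothetical input: a matrix of gene counts per nucleus from the
# entorhinal cortex (file name is a placeholder).
adata = sc.read_h5ad("entorhinal_cortex_nuclei.h5ad")

sc.pp.filter_cells(adata, min_genes=200)       # drop low-quality nuclei
sc.pp.normalize_total(adata, target_sum=1e4)   # correct for sequencing depth
sc.pp.log1p(adata)                             # variance-stabilizing transform
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable]

sc.tl.pca(adata, n_comps=50)                   # reduce dimensionality
sc.pp.neighbors(adata)                         # nucleus-nucleus similarity graph
sc.tl.leiden(adata)                            # cluster into putative cell types

# A cluster enriched for RORB could then be tracked across donors at
# different disease stages to ask whether it is depleted early.
print(adata.obs["leiden"].value_counts())
```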

In both regions, these vulnerable cells were distinguished by their expression of a protein called RORB. This allowed researchers in Grinberg’s neuropathology lab, led by former lab manager Rana Eser, to examine RORB-expressing neurons in more detail in brain tissue from a larger cohort of 26 donors. They used histological staining techniques to examine the fate of cells from both healthy individuals and those with early- and late-stage Alzheimer’s. This work validated that RORB-expressing neurons do in fact die off early on in the disease and also accumulate tau tangles earlier than neighboring, non-RORB-expressing neurons.

“These findings support the view that tau buildup is a critical driver of neurodegeneration, but we also know from other data from the Grinberg lab that not every cell that builds up these aggregates is equally susceptible,” said Leng, who plans to continue studying factors underlying RORB neurons’ selective vulnerability using CRISPR-based technology the Kampmann lab has developed.

It’s not clear whether RORB itself causes the cells’ selective vulnerability, the researchers said, but the protein provides a valuable new molecular “handle” for future studies to understand what makes these cells succumb to Alzheimer’s pathology, and how their vulnerability could potentially be reversed.  

“Our discovery of a molecular identifier for these selectively vulnerable cells gives us the opportunity to study in detail exactly why they succumb to tau pathology, and what could be done to make them more resilient,” Leng said. “This would be a totally new and much more targeted approach to developing therapies to slow or prevent the spread of Alzheimer’s disease.”  

Authors: Grinberg and Kampmann are both faculty members of the Department of Neurology in the UCSF Weill Institute for Neurosciences. Additional authors on the study were Antonia Piergies, Rene Sit, Michelle Tan, Norma Neff, Song Hua Li, Alexander Ehrenberg, William W. Seeley, and Salvatore Spina of UCSF; Roberta Diehl Rodriguez, Claudia Kimie Suemoto, Renata Elaine Paraizo Leite, and Carlos A. Pasqualucci of Universidade de São Paulo in Brazil; and Helmut Heinsen, of the Universidade de São Paulo and University of Würzburg in Germany. 

Funding: The research was supported by the U.S. National Institutes of Health (NIH) (F30 AG066418, K08 AG052648, R56 AG057528, K24 AG053435, U54 NS100717); an NDSEG fellowship; an Alzheimer’s Association fellowship (AARF 18-566005); Fundação de Amparo à Pesquisa do Estado de São Paulo and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (FAPESP/CAPES 2016/24326-0); and a Chan Zuckerberg Biohub Investigator Award. The UCSF Neurodegenerative Disease Brain Bank is supported by the NIH (AG023501, AG019724), the Tau Consortium, and the Bluefield Project to Cure FTD.

Reference: Leng, K., Li, E., Eser, R. et al. Molecular characterization of selectively vulnerable neurons in Alzheimer’s disease. Nat Neurosci (2021). https://www.nature.com/articles/s41593-020-00764-7 https://doi.org/10.1038/s41593-020-00764-7

Provided by University of California San Francisco

Eliminating Microplastics in Wastewater Directly at the Source (Chemistry)

A research team from INRS has developed a process for the electrolytic treatment of wastewater that degrades microplastics at the source.

The team, led by Professor Patrick Drogui, has published its results in the journal Environmental Pollution.

Electro-analytical system used to identify appropriate electrodes for anodic oxidation processes. © INRS

Wastewater can carry high concentrations of microplastics into the environment. These small particles, less than 5 mm across, can come from our clothes, usually as microfibers. Professor Patrick Drogui, who led the study, points out that there are currently no established degradation methods to handle this contaminant during wastewater treatment. Some techniques already exist, but they often involve physical separation as a means of filtering out the pollutants; these technologies do not degrade the microplastics, which means additional work is needed to manage the separated particles.

Therefore, the research team decided to degrade the particles by electrolytic oxidation, a process that does not require the addition of chemicals. “Using electrodes, we generate hydroxyl radicals (·OH) to attack microplastics. This process is environmentally friendly because it breaks them down into CO2 and water molecules, which are non-toxic to the ecosystem,” explains the researcher. The electrodes used in this process are more expensive than iron or steel electrodes, but unlike those, which degrade over time, they can be reused for several years.
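Schematically, the chemistry Drogui describes follows the commonly cited anodic-oxidation pathway, in which water discharged at the anode surface (M) yields adsorbed hydroxyl radicals that then mineralize the plastic. These are textbook reactions given for orientation, not equations reproduced from the paper:

$$\mathrm{M} + \mathrm{H_2O} \rightarrow \mathrm{M({\cdot}OH)} + \mathrm{H^+} + \mathrm{e^-}$$

$$\text{microplastic} + {\cdot}\mathrm{OH} \rightarrow \text{oxidized intermediates} \rightarrow \mathrm{CO_2} + \mathrm{H_2O}$$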


An effective treatment

Professor Drogui envisions the use of this technology at the exit of commercial laundries, a potential source of microplastic release into the environment. “When this commercial laundry water arrives at the wastewater treatment plant, it is mixed with large quantities of water, the pollutants are diluted and therefore more difficult to degrade. Conversely, by acting at the source, i.e., at the laundry, the concentration of microplastics is higher (per litre of water), thus more accessible for electrolytic degradation,” explains the specialist in electrotechnology and water treatment.

Laboratory tests conducted on water artificially contaminated with polystyrene showed a degradation efficiency of 89%. The team plans to move on to experiments on real water. “Real water contains other materials that can affect the degradation process, such as carbonates and phosphates, which can trap radicals and reduce the performance of the oxidation process,” says Professor Drogui, scientific director of the Laboratory of Environmental Electrotechnologies and Oxidative Processes (LEEPO).

If the technology demonstrates its effectiveness on real commercial laundry water, the research group intends to conduct a study to determine the cost of treatment and the adaptation of the technology to treat larger quantities of wastewater. Within a few years, the technology could be implemented in laundry facilities.


About the study

The article “Treatment of microplastics in water by anodic oxidation: A case study for polystyrene”, by Marthe Kiendrebeogo, Mahmoodreza Karimiestahbanati, Ali Khosravanipour Mostafazadeh, Patrick Drogui and Rajeshwar Dayal Tyagi, was published in the Environmental Pollution journal. The team received financial support from the Fonds de recherche du Québec – Nature et technologies (FRQNT), the CREATE-TEDGIEER program, the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canadian Francophonie Scholarship Program (CFSP).

Reference: Marthe Kiendrebeogo, M.R. Karimi Estahbanati, Ali Khosravanipour Mostafazadeh, Patrick Drogui, R.D. Tyagi, Treatment of microplastics in water by anodic oxidation: A case study for polystyrene, Environmental Pollution, Volume 269, 2021, 116168, ISSN 0269-7491, https://doi.org/10.1016/j.envpol.2020.116168. (http://www.sciencedirect.com/science/article/pii/S0269749120368573)

Provided by INRS

UCI Researchers: Climate Change Will Alter the Position of the Earth’s Tropical Rain Belt (Earth Science)

Difference by the year 2100 expected to impact global biodiversity, food security.

Future climate change will cause a regionally uneven shifting of the tropical rain belt – a narrow band of heavy precipitation near the equator – according to researchers at the University of California, Irvine and other institutions. This development may threaten food security for billions of people.

Roughly in line with the equator, Earth’s tropical rain belt is expected to shift irregularly in large hemispheric zones as a result of future climate change, according to a new study by UCI civil & environmental engineering and Earth systems science researchers. The alterations are expected to cause droughts and threaten biodiversity and food security across broad swaths of the planet by the year 2100. NASA

In a study published today in Nature Climate Change, the interdisciplinary team of environmental engineers, Earth system scientists and data science experts stressed that not all parts of the tropics will be affected equally. For instance, the rain belt will move north in parts of the Eastern Hemisphere but will move south in areas in the Western Hemisphere.

According to the study, a northward shift of the tropical rain belt over eastern Africa and the Indian Ocean will result in future increases in drought stress in southeastern Africa and Madagascar, in addition to intensified flooding in southern India. A southward creeping of the rain belt over the eastern Pacific Ocean and Atlantic Ocean will cause greater drought stress in Central America.

“Our work shows that climate change will cause the position of Earth’s tropical rain belt to move in opposite directions in two longitudinal sectors that cover almost two thirds of the globe, a process that will have cascading effects on water availability and food production around the world,” said lead author Antonios Mamalakis, who recently received a Ph.D. in civil & environmental engineering in the Henry Samueli School of Engineering at UCI and is currently a postdoctoral fellow in the Department of Atmospheric Science at Colorado State University.

The team made the assessment by examining computer simulations from 27 state-of-the-art climate models and measuring the tropical rain belt’s response to a future scenario in which greenhouse gas emissions continue to rise through the end of the current century.
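To illustrate the kind of diagnostic involved, the Python sketch below computes the rain belt’s position as a precipitation-weighted mean latitude at each longitude, which is what lets opposite shifts in the two hemispheric sectors show up instead of canceling in a global average. The random array is a placeholder for model output; this is a schematic, not the study’s analysis code.

```python
import numpy as np

lats = np.linspace(-20, 20, 41)                 # tropical latitudes, degrees
lons = np.linspace(0, 359, 360)                 # degrees east
precip = np.random.rand(lats.size, lons.size)   # placeholder precipitation field

area_w = np.cos(np.deg2rad(lats))               # account for grid-cell area

def rain_belt_latitude(p):
    """Precipitation centroid latitude, computed separately at each longitude."""
    w = p * area_w[:, None]
    return (lats[:, None] * w).sum(axis=0) / w.sum(axis=0)

centroid = rain_belt_latitude(precip)
# Differencing this curve between a historical simulation and an
# end-of-century projection gives the longitude-by-longitude shift, which
# can then be averaged within Eastern and Western Hemisphere sectors
# rather than over the whole globe.
```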

Mamalakis said the sweeping shift detected in his work was disguised in previous modelling studies that provided a global average of the influence of climate change on the tropical rain belt. Only by isolating the response in the Eastern and Western Hemisphere zones was his team able to highlight the drastic alterations to come over future decades.

Co-author James Randerson, UCI’s Ralph J. & Carol M. Cicerone Chair in Earth System Science, explained that climate change causes the atmosphere to heat up by different amounts over Asia and the North Atlantic Ocean.

“In Asia, projected reductions in aerosol emissions, glacier melting in the Himalayas and loss of snow cover in northern areas brought on by climate change will cause the atmosphere to heat up faster than in other regions,” he said. “We know that the rain belt shifts toward this heating, and that its northward movement in the Eastern Hemisphere is consistent with these expected impacts of climate change.”

He added that the weakening of the Gulf Stream current and deep-water formation in the North Atlantic is likely to have the opposite effect, causing a southward shift in the tropical rain belt across the Western Hemisphere.

“The complexity of the Earth system is daunting, with dependencies and feedback loops across many processes and scales,” said corresponding author Efi Foufoula-Georgiou, UCI Distinguished Professor of Civil & Environmental Engineering and the Henry Samueli Endowed Chair in Engineering. “This study combines the engineering approach of systems thinking with data analytics and climate science to reveal subtle and previously unrecognized manifestations of global warming on regional precipitation dynamics and extremes.”

Foufoula-Georgiou said that a next step is to translate those changes to impacts on the ground, in terms of flooding, droughts, infrastructure and ecosystem change to guide adaptation, policy and management.

Other collaborators on this study, which was funded by the National Science Foundation, included Jin-Yi Yu, Gudrun Magnusdottir, Michael Pritchard and Padhraic Smyth at UCI; Paul Levine at NASA’s Jet Propulsion Laboratory; and Sungduk Yu at Yale University.

Reference: Mamalakis, A., Randerson, J.T., Yu, JY. et al. Zonally contrasting shifts of the tropical rain belt in response to climate change. Nat. Clim. Chang. (2021). https://www.nature.com/articles/s41558-020-00963-x https://doi.org/10.1038/s41558-020-00963-x

Provided by University of California, Irvine

About the University of California, Irvine: Founded in 1965, UCI is the youngest member of the prestigious Association of American Universities. The campus has produced three Nobel laureates and is known for its academic achievement, premier research, innovation and anteater mascot. Led by Chancellor Howard Gillman, UCI has more than 36,000 students and offers 222 degree programs. It’s located in one of the world’s safest and most economically vibrant communities and is Orange County’s second-largest employer, contributing $5 billion annually to the local economy. For more on UCI, visit www.uci.edu.

A New Archeology For the Age of Man (Archeology)

Scantily clad grave robbers and silent scholars assembling pottery shards – these stereotypes dominate the public perception of archeology. But these images are worlds apart from the archeology of the 21st century. In a broad overview, scientists from the Max Planck Institute for the Science of Human History explore a thoroughly modern scientific discipline and show what contribution archeology can make to overcoming the enormous challenges of the Anthropocene.

Information from disciplines such as archeology, history, historical ecology and paleoecology plays an important role in designing sustainable solutions to the challenges of the Anthropocene.
Michelle O’Reilly, MPI-SHH

Indiana Jones and Lara Croft have a lot to answer for. The public perception of archeology is often completely out of date, and these characters do little to correct the picture.

Archeology, as it is practiced today, has almost no resemblance to the looting of graves depicted in films and video games. It bears little resemblance even to the more scientific representations in entertainment.

An article published today in Nature Ecology and Evolution aims to reset the picture for an audience that has largely been primed to take these unrealistic representations at face value. It depicts an archeology carried out by scientists in white lab coats, equipped with instruments worth millions and the latest computer technology.

The work also describes an archeology that can make a significant contribution to tackling modern challenges such as the preservation of biological diversity, food security and climate change.

“Archeology is a completely different discipline today than it was a century ago,” explains Nicole Boivin, lead author of the study and director of the institute’s archeology department. “The grave robbing we see in films is exaggerated, though archeology in the past probably bore more resemblance to it than to the discipline of today. By contrast, much of today’s archeology is very scientifically oriented and aims at solving the problems of today’s world.”

All over the world today we find many examples of how past cultural and technological practices and solutions are being revived to address the pressing challenges of environmental and land management. Examples (from left to right): the mobilization of ancient terra preta (anthropogenic dark earth) technology, the revival of “landesque capital” (landscape capital created through long-term investments in the landscape) and the use of traditional fire management systems. Michelle O’Reilly, MPI-SHH

By looking at the research contributions in the field over the past few decades, the authors come to a clear conclusion – modern archeology has contributed a lot to overcoming current challenges.

“Today humans have become one of the great forces shaping nature,” emphasizes Alison Crowther, co-author and scientist at the University of Queensland and the MPI for the Science of Human History. “In saying we have entered a new, human-dominated geological age, the Anthropocene, we recognize that role.”

How can archeology, a discipline that focuses on the past, try to meet the challenges of the Anthropocene?

“It is clear that the past offers a vast repertoire of cultural knowledge that we cannot ignore,” says Professor Boivin.

By analyzing what worked in the past and what did not, archaeological research – which in effect evaluates long-term experiments with human societies – gains insights into the factors that promote sustainability and resilience, and those that work against them.

The authors also point out how solutions from the past help to cope with today’s problems. “By harnessing information on how people in the past enriched soils, prevented destructive fires, created greener cities, and transported water without fossil fuels, scientists have helped improve the modern world,” explains Dr. Crowther.

Archaeological studies of agricultural cities with a low population density, such as ancient Angkor Wat in Cambodia, are increasingly used to support the development of more sustainable urban centers. © Alison Crowther

People still use and adapt technologies and infrastructures, including terrace and irrigation systems, which in some cases are centuries or even millennia old.

The scientists are careful to emphasize, however, that new technological and social solutions will remain essential to meeting climate change and the other challenges of the Anthropocene.

“It’s not about glorifying the past or vilifying progress,” says Professor Boivin. “Instead, it is about bringing together the best of the past, present and future in order to develop a responsible and constructive course for humanity.”

Reference: Nicole Boivin and Alison Crowther, “Mobilizing the past to shape a better Anthropocene”, Nature Ecology & Evolution, 2021. https://doi.org/10.1038/s41559-020-01361-4

Provided by Max Planck Institute for the Science of Human History

A ‘Super-puff’ Planet Like No Other (Planetary Science)

The core mass of the giant exoplanet WASP-107b is much lower than what was thought necessary to build up the immense gas envelope surrounding giant planets like Jupiter and Saturn, astronomers at Université de Montréal have found.

Artistic rendition of the exoplanet WASP-107b and its star, WASP-107. Some of the star’s light streams through the exoplanet’s extended gas layer. © ESA/Hubble, NASA, M. Kornmesser.

This intriguing discovery by Ph.D. student Caroline Piaulet of UdeM’s Institute for Research on Exoplanets (iREx) suggests that gas-giant planets form a lot more easily than previously believed.

Piaulet is part of the groundbreaking research team of UdeM astrophysics professor Björn Benneke that in 2019 announced the first detection of water on an exoplanet located in its star’s habitable zone.

Published today in the Astronomical Journal with colleagues in Canada, the U.S., Germany and Japan, the new analysis of WASP-107b’s internal structure “has big implications,” said Benneke.

“This work addresses the very foundations of how giant planets can form and grow,” he said. “It provides concrete proof that massive accretion of a gas envelope can be triggered for cores that are much less massive than previously thought.”

As big as Jupiter but 10 times lighter

WASP-107b was first detected in 2017 around WASP-107, a star about 212 light years from Earth in the Virgo constellation. The planet is very close to its star — over 16 times closer than the Earth is to the Sun. As big as Jupiter but 10 times lighter, WASP-107b is one of the least dense exoplanets known: a type that astrophysicists have dubbed “super-puff” or “cotton-candy” planets.

Piaulet and her team first used observations of WASP-107b obtained at the Keck Observatory in Hawai’i to assess its mass more accurately. They used the radial velocity method, which allows scientists to determine a planet’s mass by observing the wobbling motion of its host star due to the planet’s gravitational pull. They concluded that the mass of WASP-107b is about one tenth that of Jupiter, or about 30 times that of Earth.
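For a rough feel of how the radial velocity method turns a measured stellar wobble into a planet mass, here is a back-of-the-envelope sketch in Python. The wobble amplitude, stellar mass, and circular edge-on orbit are illustrative assumptions, not the paper’s measured inputs.

```python
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30       # kg
M_earth = 5.972e24     # kg

M_star = 0.7 * M_sun   # assumed: WASP-107 is roughly a 0.7-solar-mass star
P = 5.7 * 86400.0      # orbital period of WASP-107b, seconds
K = 14.0               # assumed stellar wobble semi-amplitude, m/s
e = 0.0                # near-circular orbit assumed

# For a planet much lighter than its star, seen edge-on (sin i ~ 1):
#   K = (2*pi*G/P)**(1/3) * M_planet / (M_star**(2/3) * sqrt(1 - e**2))
# Solving for the planet mass:
M_planet = K * (P / (2 * np.pi * G)) ** (1 / 3) * M_star ** (2 / 3) * np.sqrt(1 - e**2)

print(f"Planet mass ~ {M_planet / M_earth:.0f} Earth masses")  # ~31 with these inputs
```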

The team then did an analysis to determine the planet’s most likely internal structure. They came to a surprising conclusion: with such a low density, the planet must have a solid core of no more than four times the mass of the Earth. This means that more than 85 percent of its mass is included in the thick layer of gas that surrounds this core. By comparison, Neptune, which has a similar mass to WASP-107b, only has 5 to 15 percent of its total mass in its gas layer.
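As a quick check on those figures: with a total mass of about 30 Earth masses and a core of at most 4, the gas fraction follows directly from the round numbers in the text:

$$f_{\mathrm{gas}} = \frac{M_{\mathrm{planet}} - M_{\mathrm{core}}}{M_{\mathrm{planet}}} \geq \frac{30\,M_\oplus - 4\,M_\oplus}{30\,M_\oplus} \approx 0.87$$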

“We had a lot of questions about WASP-107b,” said Piaulet. “How could a planet of such low density form? And how did it keep its huge layer of gas from escaping, especially given the planet’s close proximity to its star?

“This motivated us to do a thorough analysis to determine its formation history.”

A gas giant in the making

Planets form in the disc of dust and gas that surrounds a young star, known as a protoplanetary disc. Classical models of gas-giant planet formation are based on Jupiter and Saturn. In these models, a solid core at least 10 times more massive than the Earth is needed to accumulate a large amount of gas before the disc dissipates.

Without a massive core, gas-giant planets were not thought able to cross the critical threshold necessary to build up and retain their large gas envelopes.

How, then, to explain the existence of WASP-107b, which has a much less massive core? McGill University professor and iREx member Eve Lee, a world-renowned expert on super-puff planets like WASP-107b, has several hypotheses.

“For WASP-107b, the most plausible scenario is that the planet formed far away from the star, where the gas in the disc is cold enough that gas accretion can occur very quickly,” she said. “The planet was later able to migrate to its current position, either through interactions with the disc or with other planets in the system.”

Discovery of a second planet, WASP-107c

The Keck observations of the WASP-107 system cover a much longer period of time than previous studies have, allowing the UdeM-led research team to make an additional discovery: the existence of a second planet, WASP-107c, with a mass of about one-third that of Jupiter, considerably more than WASP-107b’s.

WASP-107c is also much farther from the central star; it takes three years to complete one orbit around it, compared to only 5.7 days for WASP-107b. Also interesting: the eccentricity of this second planet is high, meaning its trajectory around its star is more oval than circular.
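Assuming a stellar mass of roughly 0.7 solar masses for WASP-107 (an illustrative approximation), Kepler’s third law in solar units (with $a$ in AU, $P$ in years, and $M_\star$ in solar masses, noting 5.7 days $\approx 0.0156$ years) gives a feel for how different the two orbits are:

$$a = \left(M_\star P^2\right)^{1/3}: \qquad a_{c} \approx \left(0.7 \times 3^2\right)^{1/3} \approx 1.8\ \mathrm{AU}, \qquad a_{b} \approx \left(0.7 \times 0.0156^2\right)^{1/3} \approx 0.06\ \mathrm{AU}$$

The latter value is consistent with WASP-107b sitting more than 16 times closer to its star than Earth does to the Sun, as noted above.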

“WASP-107c has in some respects kept the memory of what happened in its system,” said Piaulet. “Its great eccentricity hints at a rather chaotic past, with interactions between the planets which could have led to significant displacements, like the one suspected for WASP-107b.”

Several more questions

Beyond its formation history, there are still many mysteries surrounding WASP-107b. Studies of the planet’s atmosphere with the Hubble Space Telescope published in 2018 revealed one surprise: it contains very little methane.

“That’s strange, because for this type of planet, methane should be abundant,” said Piaulet. “We’re now reanalysing Hubble’s observations with the new mass of the planet to see how it will affect the results, and to examine what mechanisms might explain the destruction of methane.”

The young researcher plans to continue studying WASP-107b, hopefully with the James Webb Space Telescope set to launch in 2021, which will provide a much more precise idea of the composition of the planet’s atmosphere.

“Exoplanets like WASP-107b that have no analogue in our Solar System allow us to better understand the mechanisms of planet formation in general and the resulting variety of exoplanets,” she said. “It motivates us to study them in great detail.”

Reference: Caroline Piaulet, Björn Benneke, Ryan A. Rubenzahl, Andrew W. Howard et al., “WASP-107b’s density is even lower: a case study for the physics of gas envelope accretion and orbital migration,” Astronomical Journal, 161(2), 2021. https://iopscience.iop.org/article/10.3847/1538-3881/abcd3c https://doi.org/10.3847/1538-3881/abcd3c

Provided by University of Montréal

New Discovery in Breast Cancer Treatment (Medicine)

Researchers at the University of Adelaide have found new evidence about the positive role of androgens in breast cancer treatment with immediate implications for women with estrogen receptor-driven metastatic disease.

Androgen counterbalances estrogen-driven breast cancer © University of Adelaide

Published today in Nature Medicine, the international study conducted in collaboration with the Garvan Institute of Medical Research, looked at the role of androgens – commonly thought of as male sex hormones but also found at lower levels in women – as a potential treatment for estrogen receptor positive breast cancer.

Watch a video explainer about the new study at – https://youtu.be/NYalzv4C35U

In normal breast development, estrogen stimulates and androgen inhibits growth at puberty and throughout adult life. Abnormal estrogen activity is responsible for the majority of breast cancers, but the role of androgen activity in this disease has been controversial.

Androgens were historically used to treat breast cancer, but knowledge of hormone receptors in breast tissue was rudimentary at the time and the treatment’s efficacy misunderstood. Androgen therapy was discontinued due to virilising side effects and the advent of anti-estrogenic endocrine therapies.

While endocrine therapy is standard-of-care for estrogen receptor positive breast cancer, resistance to these drugs is the major cause of breast cancer mortality.

Professor Wayne Tilley © University of Adelaide

Professor Wayne Tilley, Director of the Dame Roma Mitchell Cancer Research Laboratories, and Associate Professor Theresa Hickey, Head of the Breast Cancer Group, who led the study, say the need for alternative treatment strategies has renewed interest in androgen therapy for breast cancer.

However, previous studies had produced conflicting evidence on how best to therapeutically target the androgen receptor for treatment of breast cancer, which caused widespread confusion and hampered clinical application.

Using cell-line and patient-derived models, a global team, including researchers at the University of Adelaide and the Garvan Institute, demonstrated that androgen receptor activation by natural androgen or a new androgenic drug had potent anti-tumour activity in all estrogen receptor positive breast cancers, even those resistant to current standard-of-care treatments. In contrast, androgen receptor inhibitors had no effect.

“This work has immediate implications for women with metastatic estrogen receptor positive breast cancer, including those resistant to current forms of endocrine therapy,” said Associate Professor Theresa Hickey.

Associate Professor Theresa Hickey © University of Adelaide

Professor Tilley added: “We provide compelling new experimental evidence that androgen receptor stimulating drugs can be more effective than existing (e.g. Tamoxifen) or new (e.g. Palbociclib) standard-of-care treatments and, in the case of the latter, can be combined to enhance growth inhibition.

“Moreover, currently available selective androgen receptor activating agents lack the undesirable side effects of natural androgens, and can confer benefits in women, including promotion of bone, muscle and mental health.”

Associate Professor Elgene Lim, a breast oncologist and Head of the Connie Johnson Breast Cancer Research Lab at the Garvan Institute, said: “The new insights from this study should clarify the widespread confusion over the role of the androgen receptor in estrogen receptor driven breast cancer. Given the efficacy of this treatment strategy at multiple stages of disease in our study, we hope to translate these findings into clinical trials as a new class of endocrine therapy for breast cancer.”

Dr Stephen Birrell, a breast cancer specialist and pioneer in androgens and women’s health who was part of the Adelaide based team, pointed out that this seminal finding has application beyond the treatment of breast cancer, including breast cancer prevention and treatment of other disorders also driven by estrogen.

Chloe Marshall, 33, is facing a breast cancer recurrence while pregnant with her second child. She said that endocrine therapy has terrible side effects and that there is an urgent need for better options to prevent and treat breast cancer recurrence.

“I was diagnosed with a hormone positive breast cancer in July 2017 and subsequently found out I carried the BRCA gene,” she said.

“I underwent a double mastectomy and neoadjuvant chemotherapy followed by two years of hormone suppressive treatment. The hormone suppressive treatment that I experienced was one of the hardest parts of having cancer. The impact it has on your mind/life/body is incredibly challenging.

“Now, three years later, I find myself with a recurrent cancer while 25 weeks pregnant. The thought of having hormone suppressive treatment for a further five to ten years is overwhelming.

“I think this study will help patients like myself have hope that there is another answer to life after the cancer diagnosis.”

An international Phase 3 registration clinical trial (sponsored by VERU, Inc) evaluating Enobosarm, an androgen receptor activating agent, in patients with androgen receptor and estrogen receptor positive metastatic breast cancer who failed endocrine therapy and a CDK 4/6 inhibitor (e.g. palbociclib), will commence in the second quarter of 2021.

This work was funded by grants from the National Health and Medical Research Council of Australia, the National Breast Cancer Foundation, Cancer Australia, Movember, The Hospital Research Foundation and the US Department of Defense Breast Cancer Research Program.

Reference: Hickey, T.E., Selth, L.A., Chia, K.M. et al. The androgen receptor is a tumor suppressor in estrogen receptor–positive breast cancer. Nat Med (2021). https://www.nature.com/articles/s41591-020-01168-7 https://doi.org/10.1038/s41591-020-01168-7

Provided by University of Adelaide

Six-Wavelength Spectroscopy Can Offer New Details of Surface of Venus (Planetary Science)

A trio of papers provides new insight into the composition and evolution of the surface of Venus, hidden beneath its caustic, high-temperature atmosphere. Using multi-wavelength orbital imaging – the six-band spectroscopy proposed as part of the VERITAS and EnVision missions – scientists could map the iron content of the Venusian surface and construct the planet’s first-ever geologic map.

This image of Venus is a composite of data from NASA’s Magellan spacecraft and Pioneer Venus Orbiter. Credit: NASA/JPL-Caltech

“Previous missions have only imaged one wavelength, and used 30-year-old topographic data to correct the spectra. Moreover, they were based on theoretical ideas about what Venus spectra look like, at very high temperatures. So the prior data have all been fairly qualitative,” said M. Darby Dyar, a Senior Scientist at the Planetary Science Institute and author on three recent papers on the topic. 

These papers are based on new data from the Planetary Spectroscopy Laboratory at the German Aerospace Center Institute of Planetary Research in Berlin, where Dyar works with a team including Jörn Helbert, first author of “Deriving iron contents from past and future Venus surface spectra with new high-temperature laboratory emissivity data,” which appears today in Science Advances. That lab is unique in the world in that it can acquire spectra of rocks and minerals at Venus conditions. The new data lay the groundwork for the next planned Venus missions, which should finally be able to determine the different rock types there.

Dyar is lead author on two recent papers published prior to today’s Science Advances paper: “Surface weathering on Venus: Constraints from kinetic, spectroscopic, and geochemical data,” in Icarus, on which PSI Senior Scientist Elizabeth C. Sklute is a co-author, and “Probing Venus Surface Iron Contents With Six‐Band Visible Near‐Infrared Spectroscopy From Orbit,” which appears in Geophysical Research Letters.

“We know very little about the geology of the Venus surface. What little we know comes from Soviet landers in the 1970s and from radar data from the Magellan mission, which ended in 1996. Until recently, there were no orbital spectroscopic data for Venus – as are common for other planets – because Venus is covered by thick CO2 clouds. Visible and near-infrared (VNIR) light cannot penetrate those clouds except through some very small windows in the NIR around a wavelength of 1 micron,” Dyar said.

“But now we have acquired spectra in our ‘Venus chamber’. We can sample those data to match the windows in the CO2 atmosphere that an orbiter might have,” Dyar said. “There are five windows, into which we can fit six spectral bands – the one that was used by Venus Express plus five more. A six-window spectrometer that we designed is being proposed as part of two missions: the US-led VERITAS mission and the ESA mission called EnVision. 

“The new lab data allow us to develop machine learning algorithms that can measure the iron contents of surface rocks from orbit with high accuracy. This is important because key igneous rocks types have distinctive iron contents, so we’ll be able to distinguish basalt, andesite, dacite, and rhyolites on the surface. Knowledge of rock types informs our understanding of how the Venus surface evolved,” she said. 
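As an illustration of the kind of regression described, the sketch below trains a model to map six-band emissivities to FeO content in Python using scikit-learn. The arrays are random placeholders rather than Planetary Spectroscopy Laboratory data, and the model choice is an assumption for demonstration, not the team’s actual algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples = 200
X = rng.uniform(0.6, 1.0, size=(n_samples, 6))  # emissivity in the six bands
y = rng.uniform(0.0, 15.0, size=n_samples)      # FeO content, wt% (placeholder)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, scoring="neg_mean_absolute_error", cv=5)
print(f"Cross-validated MAE: {-scores.mean():.2f} wt% FeO")

# In practice a model validated on laboratory spectra could be applied
# pixel-by-pixel to orbital images, and the inferred iron contents used
# to distinguish basalt, andesite, dacite, and rhyolite.
```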

“Our Science Advances paper essentially ‘validates’ the new lab data by showing that they match spectra taken on the surface by the Soviet landers in the 1970s. They also allow us to measure the iron contents of the basalts at two of those Soviet landing sites, providing modern chemical data for previously unknown rocks,” Dyar said.

“The Geophysical Research Letters paper more explicitly shows how we can determine FeO contents of Venus surface rocks using just the information in those six spectral bands. This means that if those missions get selected, we will be able to make a geologic map of the Venus surface from orbit using the FeO contents of the rocks,” Dyar said. “And the Icarus paper looks at the issue of recent volcanism on the Venus surface and suggests that contrary to prior work conducted under non-Venus conditions, the surface of Venus alters surprisingly slowly in its caustic atmosphere. This means that the surface won’t be covered in a single mineral, as prior work has suggested. Our paper gives estimates for how fast the surface will alter, and in turn constrains the age of the Venus surface.” 

Dyar’s work on this research was partially funded by NASA’s Planetary Instrument Concepts for the Advancement of Solar System Observations (PICASSO) program. 

Provided by Planetary Science Institute

New Paper: What Makes an Explanation Good Enough? (Psychology)

“If you look at the biggest and most divisive arguments we’re having right now,” says Simon DeDeo, SFI External Professor and Carnegie Mellon University Professor, “we often agree on the facts. We disagree on the explanations.”

Left: Conspiracy meme “Pepe Silvia”; Right: Simon DeDeo at the Santa Fe Institute. (Image: Michael Garfield/Santa Fe Institute)

What makes an explanation good enough? As a personal matter, people have different answers to this question, and not all of them agree, says a recent paper in Trends in Cognitive Sciences by DeDeo and first author Zachary Wojtowicz (Carnegie Mellon). The authors use Bayes’ Rule, a famous theorem in probability and statistics, to investigate what we value in scientific and moral explanations. 

Simplicity is one feature of a good explanation. In the sciences, for example, some people strive for a principle of simplicity known as “parsimony,” e.g., “What is the shortest program I can write to produce these results?” Others prefer what Wojtowicz and DeDeo call “co-explanation,” which looks to solve multiple puzzles with one answer and then evaluates the result in light of new revelations. In physics, this often manifests as a search for a unifying principle or mechanism acting through seemingly disparate patterns.
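In schematic form, the Bayesian lens the authors apply weighs a candidate explanation $h$ against evidence $d$ as

$$P(h \mid d) = \frac{P(d \mid h)\,P(h)}{P(d)}$$

where a taste for parsimony can be read as a prior $P(h)$ that favors simpler hypotheses, while co-explanation rewards a single $h$ that keeps the likelihood $P(d \mid h)$ high across several otherwise unrelated observations. (This is a schematic reading, not the paper’s full decomposition of explanatory values.)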

But the most enchantingly simple explanations can clash with complex ones that account for the world more fully. Part of the challenge is that explanation and prediction aren’t the same; deep learning algorithms might predict the near future with uncanny accuracy, but can’t explain the models they produce to do so. “Theories of Everything” can fail to predict the world outside their training data sets.

Conspiracy theories are a great example of how a preference for what the authors call “co-explanations” can go awry. Take the Oklahoma City bomber Timothy McVeigh, who was arrested shortly after the bombing because he was driving without a license plate with a loaded gun in the passenger seat. People had trouble understanding how a criminal mastermind could be so careless, and it was easier for some to believe that McVeigh was a scapegoat for an elite conspiracy. 

“Conspiracy theories are in many ways great explanations,” says DeDeo. “Where conspiracy theorists go wrong is often in a lopsided aesthetic,” neglecting the common sense of a domain and over-emphasizing the values associated with, for example, “unifying” explanations.

All-encompassing answers, by virtue of neglecting true complexity, come at a price.

“Explanations, when they work, enchant us,” DeDeo says, citing James Clerk Maxwell’s theory of electromagnetism as an exceedingly useful explanation that “unifies” two invisible forces. But that same enchantment can also be abused when one aesthetic dominates.

What ultimately keeps anyone honest, according to DeDeo, is interacting with other people with different ideas of what constitutes a satisfying explanation. 

Read the paper, “From Probability to Consilience: How Explanatory Values Implement Bayesian Reasoning,” in Trends in Cognitive Sciences (October 23, 2020).

Provided by Santa Fe Institute