More than a century ago, scientists proved that carbon dioxide in Earth’s atmosphere could act like a thermostat — adding more CO2 would turn up the heat, removing it would chill the planet. But back then, most scientists thought that Earth’s climate system was far too large and stable to change quickly, that any fluctuations would happen over such a long timescale that it wouldn’t matter much to everyday life (SN: 3/12/22, p. 16).
Now all it takes is a look at the Weather Channel to know how wrong scientists were. Things are changing fast. Last year alone, Europe, South Asia, China, Japan and the American West endured deadly, record-breaking heat waves (SN: 12/17/22 & 12/31/22, p. 38). As I write this, torrential rains are bringing death and destruction to California. And with levels of climate-warming gases continuing to increase in the atmosphere, extreme weather events will become even more frequent. Given the vastness of this threat, it’s tempting to think that any efforts that we make against it will be futile. But that’s not true. Around the world, scientists and engineers; entrepreneurs and large corporations; state, national and local governments; and international coalitions are acting to put the brakes on climate change. Last year, the United States signed into law a $369 billion investment in renewable energy technologies and other responses (SN: 12/17/22 & 12/31/22, p. 28). And the World Bank invested $31.7 billion to assist other countries.
In this issue, contributing correspondent Alexandra Witze details the paths forward: which responses will help the most, and which remain challenging. Shifting to renewable energy sources like wind and solar should be the easiest. We already have the technology, and costs have plunged over the last decade. Other approaches that are feasible but not as far along include making industrial processes more energy efficient, trapping greenhouse gases and developing clean fuels. Ultimately, the goal is to reinvent the global energy infrastructure. Societies have been retooling energy infrastructures for centuries, from water and steam power to petroleum and natural gas to nuclear power and now renewables. This next transformation will be the biggest yet. But we have the scientific understanding and technological savvy to make it happen.
This cover story kicks off a new series for Science News, The Climate Fix. In future issues, we will focus on covering solutions to the climate crisis, including the science behind innovations, the people making them happen, and the social and environmental impacts. You’ll also see expanded climate coverage for our younger readers, ages 9 and up, at Science News Explores online and in print.
With this issue, we also welcome our new publisher, Michael Gordon Voss. He comes to us with deep knowledge of the media industry, experience in both for-profit and nonprofit publishing and a love of science. Before joining Science News Media Group, Voss was publisher of Stanford Social Innovation Review, and vice president and associate publisher at Scientific American. With his arrival, Maya Ajmera, previously our publisher, takes on her new role as executive publisher. Under her leadership, we have seen unprecedented growth. We’re fortunate to have these two visionaries directing our business strategy amid a rapidly changing media environment.
In full swing
The swaying feeling in jazz music that compels feet to tap may arise from near-imperceptible delays in musicians’ timing, Nikk Ogasa reported in “Jazz gets its swing from small, subtle delays” (SN: 11/19/22, p. 5).
Reader Oda Lisa, a self-described intermediate saxophonist, has noticed these subtle delays while playing. “I recorded my ‘jazzy’ version of a beloved Christmas carol, which I sent to a friend of mine,” Lisa wrote. “She praised my effort overall, but she suggested that I get a metronome because the timing wasn’t consistent. My response was that I’m a slave to the rhythm that I hear in my head. I think now I know why.”

On the same page
Murky definitions and measurements impede social science research, Sujata Gupta reported in “Fuzzy definitions mar social science” (SN: 11/19/22, p. 10).
Reader Linda Ferrazzara found the story thought-provoking. “If there’s no consensus on the terms people use … then there can be no productive discussion or conversation. People end up talking and working at cross-purposes with no mutual understanding or progress,” Ferrazzara wrote.
Fly me to the moon
Space agencies are preparing to send the next generation of astronauts to the moon and beyond. Those crews will be more diverse in background and expertise than the crews of the Apollo missions, Lisa Grossman reported in “Who gets to go to space?” (SN: 12/3/22, p. 20).
“It is great to see a broader recognition of the work being done to make spaceflight open to more people,” reader John Allen wrote. “Future space travel will and must accommodate a population that represents humanity. It won’t be easy, but it will be done.”
The story also reminded Allen of the Gallaudet Eleven, a group of deaf adults who participated in research done by NASA and the U.S. Navy in the 1950s and ’60s. Experiments tested how the volunteers responded (or didn’t) to a range of scenarios that would typically induce motion sickness, such as a ferry ride on choppy seas. Studying how the body’s sensory systems work without the usual gravitational cues from the inner ear allowed scientists to better understand motion sickness and the human body’s adaptation to spaceflight.
Sweet dreams are made of this
A memory-enhancing method that uses sound cues may boost an established treatment for debilitating nightmares, Jackie Rocheleau reported in “Learning trick puts nightmares to bed” (SN: 12/3/22, p. 11).
Reader Helen Leaver shared her trick to a good night’s sleep: “I learned that I was having strong unpleasant adventures while sleeping, and I would awaken hot and sweaty. By eliminating the amount of heat from bedding and an electrically heated mattress pad, I now sleep well without those nightmares.”

Pest perspectives
In “Why do we hate pests?” (SN: 12/3/22, p. 26), Deborah Balthazar interviewed former Science News Explores staff writer Bethany Brookshire about her new book, Pests. The book argues that humans — influenced by culture, class, colonization and much more — create animal villains.
The article prompted reader Doug Clapp to reflect on what he considers pests or weeds. “A weed is a plant in the wrong place, and a pest is an animal in the wrong place,” Clapp wrote. But what’s considered “wrong” depends on the humans who have power over the place, he noted. “Grass in a lawn can be a fine thing. Grass in a garden choking the vegetables I’m trying to grow becomes a weed. Mice in the wild don’t bother me. Field mice migrating into my house when the weather cools become a pest, especially when they eat into my food and leave feces behind,” Clapp wrote.
The article encouraged Clapp to look at pests through a societal lens: “I had never thought of pests in terms of high-class or low-class. Likewise, the residual implications of [colonization]. Thanks for provoking me to consider some of these issues in a broader context.”
Patricia Hidalgo-Gonzalez saw the future of energy on a broiling-hot day last September.
An email alert hit her inbox from the San Diego Gas & Electric Company. “Extreme heat straining the grid,” read the message, which was also pinged as a text to 27 million people. “Save energy to help avoid power interruptions.”
It worked. People cut their energy use. Demand plunged, blackouts were avoided and California successfully weathered a crisis exacerbated by climate change. “It was very exciting to see,” says Hidalgo-Gonzalez, an electrical engineer at the University of California, San Diego who studies renewable energy and the power grid. This kind of collective societal response, in which we reshape how we interact with the systems that provide us energy, will be crucial as we figure out how to live on a changing planet.
Earth has warmed at least 1.1 degrees Celsius since the 19th century, when the burning of coal, oil and other fossil fuels began belching heat-trapping gases such as carbon dioxide into the atmosphere. Scientists agree that only drastic action to cut emissions can keep the planet from blasting past 1.5 degrees of warming — a threshold beyond which the consequences become even more catastrophic than the rising sea levels, extreme weather and other impacts the world is already experiencing.
The goal is to achieve what’s known as net-zero emissions, where any greenhouse gases still entering the atmosphere are balanced by those being removed — and to do it as soon as we can.
Scientists say it is possible to swiftly transform the ways we produce and consume energy. To show the way forward, researchers have set out paths toward a world where human activities generate little to no carbon dioxide and other greenhouse gases — a decarbonized economy.
The key to a decarbonized future lies in producing vast amounts of new electricity from sources that emit little to none of the gases, such as wind, solar and hydropower, and then transforming as much of our lives and our industries as possible to run off those sources. Clean electricity needs to power not only the planet’s current energy use but also the increased demands of a growing global population.
Once humankind has switched nearly entirely to clean electricity, we will also have to counterbalance the carbon dioxide we still emit — yes, we will still emit some — by pulling an equivalent amount of carbon dioxide out of the atmosphere and storing it somewhere permanently.
Achieving net-zero emissions won’t be easy. Getting to effective and meaningful action on climate change requires overcoming decades of inertia and denial about the scope and magnitude of the problem. Nations are falling well short of existing pledges to reduce emissions, and global warming remains on track to charge past 1.5 degrees perhaps even by the end of this decade.
Yet there is hope. The rate of growth in CO2 emissions is slowing globally — down from 3 percent annual growth in the 2000s to half a percent annual growth in the last decade, according to the Global Carbon Project, which quantifies greenhouse gas emissions.
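Compounded over a decade, that slowdown matters. The sketch below is a rough back-of-the-envelope illustration only: the 3 percent and 0.5 percent growth rates come from the story, while the starting emissions figure is an assumed round number, not a Global Carbon Project statistic.

```python
# Compare cumulative CO2 emitted over a decade under two annual growth rates.
def decade_emissions(start, rate, years=10):
    """Total emissions over `years`, starting at `start` and growing by `rate` per year."""
    total = 0.0
    level = start
    for _ in range(years):
        total += level      # emissions released this year
        level *= 1 + rate   # next year's emissions after growth
    return total

base = 37.0  # gigatons CO2 per year, an assumed round starting figure
fast = decade_emissions(base, 0.03)   # 2000s-style 3 percent annual growth
slow = decade_emissions(base, 0.005)  # last decade's roughly 0.5 percent growth
print(f"3% growth:   {fast:.0f} Gt over 10 years")
print(f"0.5% growth: {slow:.0f} Gt over 10 years")
```

Even with identical starting points, the slower growth rate trims tens of gigatons from the decade's cumulative total.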
There are signs annual emissions could start shrinking. And over the last two years, the United States, by far the biggest cumulative contributor to global warming, has passed several pieces of federal legislation that include financial incentives to accelerate the transition to clean energy. “We’ve never seen anything at this scale,” says Erin Mayfield, an energy researcher at Dartmouth College.
Though the energy transition will require many new technologies, such as innovative ways to permanently remove carbon from the atmosphere, many of the solutions, such as wind and solar power, are in hand — “stuff we already have,” Mayfield says.

The current state of carbon dioxide emissions
Of all the emissions that need to be slashed, the most important is carbon dioxide, which comes from many sources such as cars and trucks and coal-burning power plants. The gas accounted for 79 percent of U.S. greenhouse gas emissions in 2020. The next most significant greenhouse gas, at 11 percent of emissions in the United States, is methane, which comes from oil and gas operations as well as livestock, landfills and other land uses.
The amount of methane may seem small, but it is mighty — over the short term, methane is more than 80 times as effective at trapping heat as carbon dioxide is, and methane’s atmospheric levels have nearly tripled in the last two centuries. Other greenhouse gases include nitrous oxide, which comes from sources such as applying fertilizer to crops or burning fuels and accounts for 7 percent of U.S. emissions, and human-made fluorinated gases such as hydrofluorocarbons that account for 3 percent.
Globally, emissions are dominated by large nations that produce lots of energy. The United States alone emits around 5 billion metric tons of carbon dioxide each year. It has emitted more greenhouse gases over history than any other country and ceded the spot for top annual emitter to China only in the mid-2000s. India ranks third.
Because of the United States’ outsized role in producing the carbon pollution emitted to date, many researchers and advocates argue that it has the moral responsibility to take the global lead on cutting emissions. And the United States has the most ambitious goals of the major emitters, at least on paper. President Joe Biden has said the country is aiming to reach net-zero emissions by 2050. Leaders in China and India have set net-zero goals of 2060 and 2070, respectively.
Under the auspices of a 2015 international climate change treaty known as the Paris agreement, 193 nations plus the European Union have pledged to reduce their emissions. The agreement aims to keep global warming well below 2 degrees, and ideally to 1.5 degrees, above preindustrial levels. But it is insufficient. Even if all countries cut their emissions as much as they have promised under the Paris agreement, the world would likely blow past 2 degrees of warming before the end of this century.
Every nation continues to find its own path forward. “At the end of the day, all the solutions are going to be country-specific,” says Sha Yu, an earth scientist at the Pacific Northwest National Laboratory and University of Maryland’s Joint Global Change Research Institute in College Park, Md. “There’s not a universal fix.”
But there are some common themes for how to accomplish this energy transition — ways to focus our efforts on the things that will matter most. These are efforts that go beyond individual consumer choices such as whether to fly less or eat less meat. They instead penetrate every aspect of how society produces and consumes energy.
Such massive changes will need to overcome a lot of resistance, including from companies that make money off old forms of energy as well as politicians and lobbyists. But if society can make these changes, it will rank as one of humanity’s greatest accomplishments. We will have tackled a problem of our own making and conquered it.
Here’s a look at what we’ll need to do.
Make as much clean electricity as possible
To meet the need for energy without putting carbon dioxide into the atmosphere, countries would need to dramatically scale up the amount of clean energy they produce. Fortunately, most of that energy would be generated by technologies we already have — renewable sources of energy including wind and solar power.
“Renewables, far and wide, are the key pillar in any net-zero scenario,” says Mayfield, who worked on an influential 2021 report from Princeton University’s Net-Zero America project, which focused on the U.S. economy.
The Princeton report envisions wind and solar power production roughly quadrupling by 2030 to get the United States to net-zero emissions by 2050. That would mean building many new solar and wind farms, so many that in the most ambitious scenario, wind turbines would cover an area the size of Arkansas, Iowa, Kansas, Missouri, Nebraska and Oklahoma combined. Such a scale-up is only possible because prices to produce renewable energy have plunged. The cost of wind power has dropped nearly 70 percent, and solar power nearly 90 percent, over the last decade in the United States. “That was a game changer that I don’t know if some people were expecting,” Hidalgo-Gonzalez says.
Globally, the price drop in renewables has allowed growth to surge. China, for instance, installed a record 55 gigawatts of solar power capacity in 2021, for a total of 306 gigawatts, or nearly 13 percent of the nation’s installed capacity to generate electricity. China is almost certain to have had another record year for solar power installations in 2022.
Challenges include figuring out ways to store and transmit all that extra electricity, and finding locations for wind and solar power installations that are acceptable to local communities. Other types of low-carbon power, such as hydropower and nuclear power (which faces its own public resistance), will also likely play a role going forward.

Get efficient and go electric
The drive toward net-zero emissions also requires boosting energy efficiency across industries and electrifying as many aspects of modern life as possible, such as transportation and home heating.
Some industries are already shifting to more efficient methods of production, such as steelmaking in China that incorporates hydrogen-based furnaces that are much cleaner than coal-fired ones, Yu says. In India, simply closing down the most inefficient coal-burning power plants provides the most bang for the buck, says Shayak Sengupta, an energy and policy expert at the Observer Research Foundation America think tank in Washington, D.C. “The list has been made up,” he says, of the plants that should close first, “and that’s been happening.”
To achieve net-zero, the United States would need to increase its share of electric heat pumps, which heat houses much more cleanly than gas- or oil-fired appliances, from around 10 percent in 2020 to as much as 80 percent by 2050, according to the Princeton report. Federal subsidies for these sorts of appliances are rolling out in 2023 as part of the new Inflation Reduction Act, legislation that contains a number of climate-related provisions.
Shifting cars and other vehicles away from burning gasoline to running on electricity would also lead to significant emissions cuts. In a major 2021 report, the National Academies of Sciences, Engineering and Medicine said that one of the most important moves in decarbonizing the U.S. economy would be having electric vehicles account for half of all new vehicle sales by 2030. That’s not impossible; electric car sales accounted for nearly 6 percent of new sales in the United States in 2022, which is still a low number but nearly double the previous year.
Make clean fuels
Some industries such as manufacturing and transportation can’t be fully electrified using current technologies — battery-powered airplanes, for instance, will probably never be feasible for long-duration flights. Technologies that still require liquid fuels will need to switch from gas, oil and other fossil fuels to low-carbon or zero-carbon fuels.
One major player will be fuels extracted from plants and other biomass, which take up carbon dioxide as they grow and emit it when they die, making them essentially carbon neutral over their lifetime. To create biofuels, farmers grow crops whose harvest is then processed in conversion facilities into fuels such as hydrogen. Hydrogen, in turn, can be substituted for more carbon-intensive substances in various industrial processes such as making plastics and fertilizers — and maybe even as fuel for airplanes someday.
In one of the Princeton team’s scenarios, the U.S. Midwest and Southeast would become peppered with biomass conversion plants by 2050, so that fuels can be processed close to where crops are grown. Many of the biomass feedstocks could potentially grow alongside food crops or replace other, nonfood crops.

Cut methane and other non-CO2 emissions
Greenhouse gas emissions other than carbon dioxide will also need to be slashed. In the United States, the majority of methane emissions come from livestock, landfills and other agricultural sources, as well as scattered sources such as forest fires and wetlands. But about one-third of U.S. methane emissions come from oil, gas and coal operations. These may be some of the first places that regulators can target for cleanup, especially “super emitters” that can be pinpointed using satellites and other types of remote sensing.
In 2021, the United States and the European Union unveiled what became a global methane pledge endorsed by 150 countries to reduce emissions. There is, however, no enforcement of it yet. And China, the world’s largest methane emitter, has not signed on.
Nitrous oxide emissions could be reduced by improving soil management techniques, and fluorinated gases by finding alternatives and improving production and recycling efforts.
Sop up as much CO2 as possible
Once emissions have been cut as much as possible, reaching net-zero will mean removing and storing an equivalent amount of carbon to what society still emits.
One solution already in use is to capture carbon dioxide produced at power plants and other industrial facilities and store it permanently somewhere, such as deep underground. Globally, there are around 35 such operations, which collectively draw down around 45 million tons of carbon dioxide annually. About 200 new plants are on the drawing board, with the aim of operating by the end of this decade, according to the International Energy Agency.
The Princeton report envisions carbon capture being added to almost every kind of U.S. industrial plant, from cement production to biomass conversion. Much of the carbon dioxide would be liquefied and piped along more than 100,000 kilometers of new pipelines to deep geologic storage, primarily along the Texas Gulf Coast, where underground reservoirs can be used to trap it permanently. This would be a massive infrastructure effort. Building this pipeline network could cost up to $230 billion, including $13 billion for early buy-in from local communities and permitting alone.
Another way to sop up carbon is to get forests and soils to take up more. That could be accomplished by converting crops that are relatively carbon-intensive, such as corn to be used in ethanol, to energy-rich grasses that can be used for more efficient biofuels, or by turning some cropland or pastures back into forest. It’s even possible to sprinkle crushed rock onto croplands, which accelerates natural weathering processes that suck carbon dioxide out of the atmosphere.
Another way to increase the amount of carbon stored in the land is to reduce the amount of the Amazon rainforest that is cut down each year. “For a few countries like Brazil, preventing deforestation will be the first thing you can do,” Yu says.
When it comes to climate change, there’s no time to waste
The Princeton team estimates that the United States would need to invest at least an additional $2.5 trillion over the next 10 years for the country to have a shot at achieving net-zero emissions by 2050. Congress has begun ramping up funding with two large pieces of federal legislation it passed in 2021 and 2022. Those steer more than $1 trillion toward modernizing major parts of the nation’s economy over a decade — including investing in the energy transition to help fight climate change.
Between now and 2030, solar and wind power, plus increasing energy efficiency, can deliver about half of the emissions reductions needed for this decade, the International Energy Agency estimates. After that, the primary drivers would need to be increasing electrification, carbon capture and storage, and clean fuels such as hydrogen. The trick is to do all of this without making people’s lives worse. Developing nations need to be able to supply energy for their economies to develop. Communities whose jobs relied on fossil fuels need to have new economic opportunities.
Julia Haggerty, a geographer at Montana State University in Bozeman who studies communities that are dependent on natural resources, says that those who have money and other resources to support the transition will weather the change better than those who are under-resourced now. “At the landscape of states and regions, it just remains incredibly uneven,” she says.
The ongoing energy transition also faces unanticipated shocks such as Russia’s invasion of Ukraine, which sent energy prices soaring in Europe, and the COVID-19 pandemic, which initially slashed global emissions but later saw them rebound.
But the technologies exist for us to wean our lives off fossil fuels. And we have the inventiveness to develop more as needed. Transforming how we produce and use energy, as rapidly as possible, is a tremendous challenge — but one that we can meet head-on. For Mayfield, getting to net-zero by 2050 is a realistic goal for the United States. “I think it’s possible,” she says. “But it doesn’t mean there’s not a lot more work to be done.”
As far back as roughly 25,000 years ago, Ice Age hunter-gatherers may have jotted down markings to communicate information about the behavior of their prey, a new study finds.
These markings include dots, lines and the symbol “Y,” and often accompany images of animals. Over the last 150 years, the mysterious depictions, some dating back nearly 40,000 years, have been found in hundreds of caves across Europe.
Some archaeologists have speculated that the markings might relate to keeping track of time, but the specific purpose has remained elusive (SN: 7/9/19). Now, a statistical analysis, published January 5 in Cambridge Archaeological Journal, presents evidence that past people may have been recording the mating and birthing schedule of local fauna. By comparing the marks to the animals’ life cycles, researchers showed that the number of dots or lines in a given image strongly correlates with the month of mating across all the analyzed examples, which included aurochs (an extinct species of wild cattle), bison, horses, mammoths and fish. What’s more, the position of the symbol “Y” in a sequence was predictive of birth month, suggesting that “Y” signifies “to give birth.”
The finding is one of the earliest records of a coherent notational system, the researchers say. It indicates that people at the time were able to interpret the meaning of an item’s position in a sequence and plan ahead for the distant future using a calendar of sorts — reinforcing the suggestion that they were capable of complex cognition. “This is a really big deal cognitively,” says Ben Bacon, an independent researcher based in London. “We’re dealing with a system that has intense organization, intense logic to it.”
A furniture conservator by day, Bacon spent years poring through scientific articles to compile over 800 instances of these cave markings. From his research and reading the literature, he reasoned that the dots corresponded to the 13 lunar cycles in a year. But he thought that the hunter-gatherers would’ve been more concerned with seasonal changes than the moon.
In the new paper, he and colleagues argue that rather than pinning a calendar to astronomical events like the equinox, the hunter-gatherers started their calendar year with the snowmelt in the spring. Not only would the snowmelt be a clear point of origin, but the meteorological calendar would also account for differences in timing across locations. For example, though snowmelt would start on different dates in different latitudes, bison would always mate approximately four lunar cycles — or months — after that region’s snowmelt, as indicated by four dots or lines.
“This is why it’s such a clever system, because it’s based on the universal,” Bacon says. “Which means if you migrate from the Pyrenees to Belgium, you can just use the same calendar.”
He needed data to test his idea. After compiling the markings, he worked with academic researchers to identify the timing of migration, mating and birth for common Ice Age animals targeted by hunter-gatherers, using archaeological data or comparisons with similar modern animals. Next, the researchers determined whether the marks aligned significantly with important life events based on this calendar. When the team ran the statistical analysis, the results strongly supported Bacon’s theory.
When explaining the markings, “we’ve argued for notational systems before, but it’s always been fairly speculative as to what the people were counting and why they were counting,” says Brian Hayden, an archaeologist at Simon Fraser University in Burnaby, British Columbia, who peer-reviewed the paper. “This adds a lot more depth and specificity to why people were keeping calendars and how they were using them.”
Linguistic experts argue that, given the lack of conventional syntax and grammar, the marks wouldn’t be considered writing. But that doesn’t make the finding inherently less exciting, says paleoanthropologist Genevieve von Petzinger of the Polytechnic Institute of Tomar in Portugal, who wasn’t involved in the study. Writing systems are often mistakenly considered a pinnacle of achievement, when in fact writing would be developed only in cultural contexts where it’s useful, she says. Instead, it’s significant that the marks provide a way to keep records outside of the mind.
“In a way, that was the huge cognitive leap,” she says. “Suddenly, we have the ability to preserve [information] beyond the moment. We have the ability to transmit it across space and time. Everything starts to change.”
The debate over these marks’ meanings continues. Archaeologist April Nowell doesn’t buy many of the team’s assumptions. “It boggles my mind why one would need a calendar … to predict that animals were going to have offspring in the spring,” says Nowell, of the University of Victoria in British Columbia. “The amount of information that this calendar is providing, if it really is a calendar, is quite minimal.”
Hayden adds that, while the basic pattern would still hold, some of the cave marks had “wiggle room for interpretation.” The next step, he says, will be to review and verify the interpretations of the marks.
Prairie voles have long been heralded as models of monogamy. Now, a study suggests that the “love hormone” once thought essential for their bonding — oxytocin — might not be so necessary after all.
Interest in the romantic lives of prairie voles (Microtus ochrogaster) was first sparked more than 40 years ago, says Devanand Manoli, a biologist at the University of California, San Francisco. Biologists trying to capture voles to study would frequently catch two at a time, because “what they were finding were these male-female pairs,” he says. Unlike many other rodents with their myriad partners, prairie voles, it turned out, mate for life (SN: 10/5/15). Pair-bonded prairie voles prefer each other’s company over a stranger’s and like to huddle together both in the wild and the lab. Because other vole species don’t display such complex social behaviors, prairie voles have been a popular animal system for studying how social behavior evolves.
Research over the last few decades has implicated a few hormones in the brain as vital for proper vole manners, most notably oxytocin, which is also important for social behavior in humans and other animals.
Manoli and colleagues thought the oxytocin receptor, the protein that detects and reacts to oxytocin, would be the perfect test target for a new genetic engineering method based on CRISPR technology, which uses molecules from bacteria to selectively turn off genes. The researchers used the technique on vole embryos to create animals born without functioning oxytocin receptors. The team figured that the rodents wouldn’t be able to form pair-bonds — just like voles in past experiments whose oxytocin activity was blocked with drugs.
Instead, Manoli says, the researchers got “a big surprise.” The voles could form pair-bonds even without oxytocin, the team reports in the March 15 Neuron.
“I was very surprised by their results,” says Larry Young, a biologist at Emory University in Atlanta, who was not involved with the study but has studied oxytocin in prairie voles for decades.
A key difference between the new study and past studies that used drugs to block oxytocin is the timing of exactly when the hormone’s activity is turned off. With drugs, the voles are adults and have had exposure to oxytocin in their brains before the shutoff. With CRISPR, “these animals are born never experiencing oxytocin signaling in the brain,” says Young, whose research group has recently replicated Manoli’s experiment and found the same result.
It may be, Young says, that pair-bonding is controlled by a brain circuit that typically becomes dependent on oxytocin through exposure to it during development, like a symphony orchestra trained by a conductor. Suddenly remove that conductor and the orchestra will sound discordant, whereas a jazz band that’s never practiced with a conductor fares just fine without one. Manoli agrees that the technique’s timing matters. A secondary reason for the disparity, he says, could be that drugs often have off-target effects, such that the chemicals meant to block oxytocin could have been doing other things in the voles’ brains to affect pair-bonding. But Young disagrees. “I don’t believe that,” he says. “The [drug] that people use is very selective,” not even binding to the receptor of oxytocin’s closest molecular relative, vasopressin.
Does this result mean that decades of past work on pair-bonding has been upended? Not quite.
“It shows us that this is a much more complicated question,” Manoli says. “The pharmacologic manipulations … suggested that [oxytocin] plays a critical role. The question is, what is that role?”
The seemingly startling new result makes sense if you look at the big picture, Manoli says. Voles’ ability to pair-bond is “so critical for the survival of the species,” he says. “From a genetics perspective, it may make sense that there isn’t a single point of failure.”
The group now hopes to look at how other hormones, like vasopressin, influence pair-bonding using this relatively new genetic technique. They are also looking more closely at the voles’ behavior to be sure that the CRISPR gene editing didn’t alter it in a way they haven’t noticed yet.
In the game of vole “love,” it looks like we’re still trying to understand all the players.
Penicillin, effective against many bacterial infections, is often a first-line antibiotic. Yet it is also one of the most common causes of drug allergies. Around 10 percent of people say they’ve had an allergic reaction to penicillin, according to the U.S. Centers for Disease Control and Prevention.
Now researchers have found a genetic link to the hypersensitivity, which, while rarely fatal, can cause hives, wheezing, arrhythmias and more.
People who report penicillin allergies can have a genetic variation on an immune system gene that helps the body distinguish between our own cells and harmful bacteria and viruses. That hot spot is on the major histocompatibility complex gene HLA-B, said Kristi Krebs, a pharmacogenomics researcher for the Estonian Genome Center at the University of Tartu. She presented the finding October 26 at the American Society of Human Genetics 2020 virtual meeting. The research was also published online October 1 in the American Journal of Human Genetics.
Several recent studies have connected distinct differences in HLA genes to bad reactions to specific drugs. For example, studies have linked an HLA-B variant to adverse reactions to an HIV/AIDS medication called abacavir, and they’ve linked a different HLA-B variant to allergic reactions to the gout medicine allopurinol. “So it’s understandable that this group of HLA variants can predispose us to higher risk of allergic drug reactions,” says Bernardo Sousa-Pinto, a researcher in drug allergies and evidence synthesis at the University of Porto in Portugal, who was not involved in the study.
For the penicillin study, the team hunted through more than 600,000 electronic health records that included genetic information for people who self-reported penicillin allergies. The researchers used several genetic search tools, which comb through DNA in search of genetic variations that may be linked to a health problem. Their search turned up a specific spot on chromosome 6, on a variant called HLA-B*55:01.
The group then checked its results against 1.12 million people of European ancestry in the research database of the genetic-testing company 23andMe and found the same link. A check of smaller databases including people with East Asian, Middle Eastern and African ancestries found no similar connection, although those sample sizes were too small to be sure, Krebs said.
It’s too soon to tell if additional studies will “lead to better understanding of penicillin allergy and also better prediction,” she said.
Penicillin allergies often begin in childhood, but can wane over time, making the drugs safer to use some years later, Sousa-Pinto says. In this study, self-reported allergies were not confirmed with a test, so there’s a chance that some participants were misclassified. This is very common, Sousa-Pinto says. “It would be interesting to replicate this study in … participants with confirmed penicillin allergy.”
The distinction matters, because about 90 percent of patients who claim to be allergic to penicillin can actually safely take the drug (SN: 12/11/16). Yet, Sousa-Pinto says, those people may be given a more-expensive antibiotic that may not work as well. Less-effective antibiotics can make patients more prone to infections with bacteria that are resistant to the drugs. “This … is something that has a real impact on health care and on health services,” he says.
The fate of a potential new Alzheimer’s drug is still uncertain. Evidence that the drug works isn’t convincing enough for it to be approved, outside experts told the U.S. Food and Drug Administration during a Nov. 6 virtual meeting that at times became contentious.
The scientists and clinicians were convened at the request of the FDA to review the evidence for aducanumab, a drug that targets a protein called amyloid-beta that accumulates in the brains of people with Alzheimer’s. The drug is designed to stick to A-beta and stop it from forming larger, more dangerous clumps. That could slow the disease’s progression but not stop or reverse it.
When asked whether a key clinical study provided strong evidence that the drug effectively treated Alzheimer’s, eight of 11 experts voted no. One expert voted yes, and two were uncertain.
The FDA is not bound to follow the recommendations of the advisory committee, though it has historically done so. If ultimately approved, the drug would be a milestone, says neurologist and neuroscientist Arjun Masurkar of New York University Langone’s Alzheimer’s Disease Research Center. Aducanumab “would be the first therapy that actually targets the underlying disease itself and slows progression.”
Developed by the pharmaceutical company Biogen, which is based in Cambridge, Mass., the drug is controversial. That’s because two large clinical trials of aducanumab have yielded different outcomes, one positive and one negative (SN: 12/5/19). The trials were also paused at one point, based on analyses that suggested the drug didn’t work.
Those unusual circumstances created gaps in the evidence, leaving big questions in some scientists’ minds about whether the drug is effective. Aducanumab’s ability to treat Alzheimer’s “cannot be proven by clinical trials with divergent outcomes,” researchers wrote in a perspective article published November 1 in Alzheimer’s & Dementia. The drug should be tested again with a different clinical trial, those researchers say.
But other groups, including the Alzheimer’s Association, are rooting for the drug. In a letter sent to the FDA on October 23, the nonprofit health organization urged aducanumab’s approval, along with longer-term studies of the drug.
“While the trial data has led to some uncertainty among the scientific community, this must be weighed against the certainty of what this disease will do to millions of Americans absent a treatment,” Joanne Pike, chief strategy officer of the Alzheimer’s Association, wrote in the letter. She noted that by 2050, more than 13 million Americans 65 and older may have Alzheimer’s. More than 5 million Americans currently have the disease.
Even with an eventual approval, questions would remain for patients and their caregivers, says Zaldy Tan, a geriatric memory specialist at Cedars-Sinai Medical Center in Los Angeles. “Cost and logistics are going to be complex issues to tackle,” he says. One estimate puts aducanumab’s price tag at $40,000 annually, for instance, and treatment would be delivered by injection, requiring regular visits to a health care facility.
Bacteria go to extremes to handle hard times: They hunker down, building a fortress-like shell around their DNA and turning off all signs of life. And yet, when times improve, these dormant spores can rise from the seeming dead.
But “you gotta be careful when you decide to come back to life,” says Peter Setlow, a biochemist at UConn Health in Farmington. “Because if you get it wrong, you die.” How is a spore to tell?
For spores of the bacterium Bacillus subtilis, the solution is simple: It counts.
These “living rocks” sense it’s time to revive, or germinate, by essentially counting how often they encounter nutrients, researchers report in a new study in the Oct. 7 Science. “They appear to have literally no measurable biological activity,” says Gürol Süel, a microbiologist at the University of California, San Diego. But Süel and his colleagues knew that spores’ cores contain positively charged potassium ions, and because these ions can move around without the cell using energy, the team suspected that potassium could be involved in shocking the cells awake.
So the team exposed B. subtilis spores to nutrients and used colorful dyes to track the movement of potassium out of the core. With each exposure, more potassium left the core, shifting its electrical charge to be more negative. Once the spores’ cores were negatively charged enough, germination was triggered, like a champagne bottle finally popping its cork. The number of exposures it took to trigger germination varied by spore, just like some corks require more or less twisting to pop. Spores whose potassium movement was hamstrung showed limited change in electric charge and were less likely to “pop” back to life no matter how many nutrients they were exposed to, the team’s experiments showed.
Changes in the electrical charge of a cell are important across the tree of life, from determining when brain cells zip off messages to each other, to the snapping of a Venus flytrap (SN: 10/14/20). Finding that spores also use electrical charges to set their wake-up calls excites Süel. “You want to find principles in biology,” he says, “processes that cross systems, that cross fields and boundaries.”
Spores are not only interesting for their unique and extreme biology, but also for practical applications. Some “can cause some rather nasty things” from food poisoning to anthrax, says Setlow, who was not involved in the study. Since spores are resistant to most antibiotics, understanding germination could lead to a way to bring them back to life in order to kill them for good.
Still, there are many unanswered questions about the “black box” of how spores start germination, like whether it’s possible for the spores to “reset” their potassium count. “We really are in the beginnings of trying to fill in that black box,” says Kaito Kikuchi, a biologist now at Reveal Biosciences in San Diego who conducted the work while at the University of California, San Diego. But discovering how spores manage to track their environment while more dead than alive is an exciting start.
Giving revamped silkworm silk a metallic bath may make the strands both strong and stiff, scientists report October 6 in Matter. Some strands were up to 70 percent stronger than silk spun by spiders, the team found.
The work is the latest in a decades-long quest to create fibers as strong, lightweight and biodegradable as spider silk. If scientists could mass-produce such material, the potential uses range from the biomedical to the athletic. Sutures, artificial ligaments and tendons — even sporting equipment could get an arachnid enhancement. “If you’ve got a climbing rope that weighs half of what it normally does and still has the same mechanical properties, then obviously you’re going to be a happy climber,” says Randy Lewis, a silk scientist at Utah State University in Logan who was not involved with the study.
Scrounging up enough silky material to make these super strong products has been a big hurdle. Silk from silkworms is simple to harvest, but not all that strong. And spider silk, the gold standard for handspun strength and toughness, is not exactly easy to collect. “Unlike silkworms, spiders cannot be farmed due to their territorial and aggressive nature,” write study coauthor Zhi Lin, a structural biologist at Tianjin University in China, and colleagues.
Scientists around the world have tried to spin sturdy strands in the lab using silkworm cocoons as a starting point. The first step is to strip off the silk’s gummy outer coating. Scientists can do this by boiling the fibers in a chemical bath, but that can be like taking a hatchet to silk proteins. If the proteins get too damaged, it’s hard for scientists to respin them into high-quality strands, says Chris Holland, a materials scientist at the University of Sheffield in England who was not involved in the study.
Lin’s team tried gentler approaches, one of which used lower temperatures and a papaya enzyme to help dissolve the silk’s coating. That mild-mannered method seemed to work. “They don’t have little itty-bitty pieces of silk protein,” Lewis says. “That’s huge because the bigger the proteins that remain, the stronger the fibers are going to be.” After some processing steps, the researchers forced the resulting silk sludge through a tiny tube, like squeezing out toothpaste. Then, they bathed the extruded silk in a solution containing zinc and iron ions, eventually stretching the strands like taffy to make long, skinny fibers. The metal dip could be why some of the strands were so strong — Lin’s team detected zinc ions in the finished fibers. But Holland and Lewis aren’t so sure.
The team’s real innovation may be that “they’ve managed to unspin silk in a less damaging way,” Holland says. Lewis agrees. “In my mind,” he says, “that’s a major step forward.”
Humankind is seeing Neptune’s rings in a whole new light thanks to the James Webb Space Telescope.
In an infrared image released September 21, Neptune and its gossamer diadems of dust take on an ethereal glow against the inky backdrop of space. The stunning portrait is a huge improvement over the rings’ previous close-up, which was taken more than 30 years ago.
Unlike the dazzling belts encircling Saturn, Neptune’s rings appear dark and faint in visible light, making them difficult to see from Earth. The last time anyone saw Neptune’s rings was in 1989, when NASA’s Voyager 2 spacecraft, after tearing past the planet, snapped a couple of grainy photos from roughly 1 million kilometers away (SN: 8/7/17). In those photos, taken in visible light, the rings appear as thin, concentric arcs.
As Voyager 2 sped on toward interstellar space, Neptune’s rings once again went into hiding — until July. That’s when the James Webb Space Telescope, or JWST, turned its sharp, infrared gaze toward the planet from roughly 4.4 billion kilometers away (SN: 7/11/22). Neptune itself appears mostly dark in the new image. That’s because methane gas in the planet’s atmosphere absorbs much of its infrared light. A few bright patches mark where high-altitude methane ice clouds reflect sunlight.
And then there are the ever-elusive rings. “The rings have lots of ice and dust in them, which are extremely reflective in infrared light,” says Stefanie Milam, a planetary scientist at NASA’s Goddard Space Flight Center in Greenbelt, Md., and one of JWST’s project scientists. The enormity of the telescope’s mirror also makes its images extra sharp. “JWST was designed to look at the first stars and galaxies across the universe, so we can really see fine details that we haven’t been able to see before,” Milam says.
Upcoming JWST observations will look at Neptune with other scientific instruments. That should provide new intel on the rings’ composition and dynamics, as well as on how Neptune’s clouds and storms evolve, Milam says. “There’s more to come.”