With the blessings of the families of all 14 lost astronauts, a new memorial to the Challenger and Columbia space shuttle disasters opened in June at the Kennedy Space Center in Florida. The permanent exhibit includes the first pieces of shuttle wreckage ever on public display, but fittingly focuses more on the lives lost.
“Forever Remembered” is housed inside the space center’s new $100 million exhibit about the space shuttle Atlantis. Below the nose of the intact shuttle, visitors enter a hall lit by tributes to each astronaut from the lost missions, those from Challenger on the left and Columbia on the right. Each display includes glimpses of the astronaut’s life. Items include plans for remodeling the home of Challenger pilot Michael Smith and a recovered page, in Hebrew, from the Columbia flight journal of Ilan Ramon, a payload specialist and the first Israeli astronaut.

Past the hall, visitors enter a small gallery with a single piece of each shuttle: a body panel from Challenger (shown at left) and cockpit window frames from Columbia. There are no extended written descriptions or flashy videos. In short, it’s a place for pondering rather than learning. As a ninth-grader in school 50 miles away when Challenger exploded in 1986, and as an adult who waited for a telltale sonic boom that never came when Columbia was lost during re-entry in 2003, I found the effect powerful.

The exhibit’s exit hallway reveals the tragedies from multiple perspectives on video displays. One video details the massive efforts to recover the wreckage and remains from the disasters, from the ocean for Challenger and from land for Columbia. Others focus on the emotional tolls and on the critical shuttle launches that followed each completed investigation.
Michael Curie, Kennedy Space Center’s news chief, says family members have been both supportive and grateful for the exhibit. “They feel that it humanizes their family members in a way that never has been done before,” he says. Indeed, “Forever Remembered” is an effective reminder of the very real risks each astronaut willingly and bravely faced.
A female giraffe has a great Valentine’s Day gift for potential mates: urine.
Distinctive anatomy helps male giraffes get a taste for whether a female is ready to mate, animal behaviorists Lynette and Benjamin Hart report January 19 in Animals. A pheromone-detecting organ in giraffes has a stronger connection to the mouth than to the nose, the researchers found. That’s why males scope out which females to mate with by sticking their tongues in a urine stream. Males of other species, such as gazelles, lick fresh urine on the ground to tell whether females are ready to mate. But giraffes’ long necks and heavy heads make bending over to investigate urine on the ground an unstable and vulnerable position, says Lynette Hart, of the University of California, Davis.
The researchers observed giraffes (Giraffa giraffa angolensis) in Etosha National Park in Namibia in 1994, 2002 and 2004. Bull giraffes nudged or kicked the female to ask her to pee. If she was a willing participant, she urinated for a few seconds, while the male took a sip. Then the male curled his lip and inhaled with his mouth, a behavior called a flehmen response, to pull the female’s scent into two openings on the roof of the mouth. From the mouth, the scent travels to the vomeronasal organ, or VNO, which detects pheromones.
The Harts say they never saw a giraffe investigate urine on the ground.
Unlike many other mammals, giraffes have a stronger oral connection to the VNO — via a duct — than a nasal one, examinations of preserved giraffe specimens showed. One possible explanation for the difference could be that a VNO-nose link helps animals that breed at specific times of the year detect seasonal plants, says Benjamin Hart, a veterinarian also at the University of California, Davis. But giraffes can mate any time of year, so the nasal connection may not matter as much.
Animals cover themselves in all kinds of unsavory fluids to keep cool. Humans sweat, kangaroos spit and some birds will urinate on themselves to survive hot days. It turns out that echidnas do something much cuter — though perhaps just as sticky (and slightly icky) — to beat the heat.
The spiny insectivores stay cool by blowing snot bubbles, researchers report January 18 in Biology Letters. The bubbles pop, keeping the critters’ noses moist. As it evaporates, this moisture draws heat away from a blood-filled sinus in the echidna’s beak, helping to cool the animal’s blood. Short-beaked echidnas (Tachyglossus aculeatus) look a bit like hedgehogs but are really monotremes — egg-laying mammals unique to Australia and New Guinea (SN: 11/18/16). Previous lab studies showed that temperatures above 35° Celsius (95° Fahrenheit) should kill echidnas. But echidnas don’t seem to have gotten the memo. They live everywhere from tropical rainforests to deserts to snow-capped peaks, leaving scientists with a physiological puzzle.
Mammals evaporate water to keep cool when temperatures climb above their body temperatures, says environmental physiologist Christine Cooper of Curtin University in Perth, Australia. “Lots of mammals do that by either licking, sweating or panting,” she says. “Echidnas weren’t believed to be able to do that.” But it’s known that the critters blow snot bubbles when it gets hot.
So, armed with a heat-vision camera and a telephoto lens, Cooper and environmental physiologist Philip Withers of the University of Western Australia in Perth drove through nature reserves in Western Australia once a month for a year to film echidnas.
In infrared, the warmest parts of the echidnas’ spiny bodies glowed in oranges, yellows and whites. But the video revealed that the tips of their noses were dark purple blobs, kept cool as moisture from their snot bubbles evaporated. Echidnas might also lose heat through their bellies and legs, the researchers report, while their spikes could act as an insulator. “Finding a way of doing this work in the field is pretty exciting,” says physiological ecologist Stewart Nicol of the University of Tasmania in Hobart, Australia, who was not involved in the study. “You can understand animals and see how they’re responding to their normal environment.” The next step, he says, is to quantify how much heat echidnas really lose through their noses and other body parts.
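The cooling mechanism described above can be put in rough physical terms: evaporating water carries away latent heat. A back-of-the-envelope sketch (the latent-heat constant is a standard physics value, not a figure from the study):

```python
# Latent heat of vaporization of water near body temperature,
# ~2.26 kJ per gram -- an assumed textbook constant, not from the study.
LATENT_HEAT_KJ_PER_G = 2.26

def heat_removed_kj(water_g: float) -> float:
    """Heat drawn away from the body by evaporating `water_g` grams of water."""
    return water_g * LATENT_HEAT_KJ_PER_G

# Evaporating even a single gram of nasal moisture sheds ~2.3 kJ of heat.
print(heat_removed_kj(1.0))  # 2.26
```

Quantifying the echidnas' actual evaporation rates, as Nicol suggests, would turn this simple proportionality into real heat-loss estimates.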
Monotremes parted evolutionary ways with other mammals between 250 million and 160 million years ago as the supercontinent Pangaea broke apart (SN: 3/8/15). So “they have a whole lot of traits that are considered to be primitive,” Cooper says. “Understanding how they might thermoregulate can give us some ideas about how thermal regulation … might have evolved in mammals.”
More than a century ago, scientists proved that carbon dioxide in Earth’s atmosphere could act like a thermostat — adding more CO2 would turn up the heat, removing it would chill the planet. But back then, most scientists thought that Earth’s climate system was far too large and stable to change quickly, that any fluctuations would happen over such a long timescale that it wouldn’t matter much to everyday life (SN: 3/12/22, p. 16).
Now all it takes is a look at the Weather Channel to know how wrong scientists were. Things are changing fast. Last year alone, Europe, South Asia, China, Japan and the American West endured deadly, record-breaking heat waves (SN: 12/17/22 & 12/31/22, p. 38). As I write this, torrential rains are bringing death and destruction to California. And with levels of climate-warming gases continuing to increase in the atmosphere, extreme weather events will become even more frequent. Given the vastness of this threat, it’s tempting to think that any efforts that we make against it will be futile. But that’s not true. Around the world, scientists and engineers; entrepreneurs and large corporations; state, national and local governments; and international coalitions are acting to put the brakes on climate change. Last year, the United States signed into law a $369 billion investment in renewable energy technologies and other responses (SN: 12/17/22 & 12/31/22, p. 28). And the World Bank invested $31.7 billion to assist other countries.
In this issue, contributing correspondent Alexandra Witze details the paths forward: which responses will help the most, and which remain challenging. Shifting to renewable energy sources like wind and solar should be the easiest. We already have the technology, and costs have plunged over the last decade. Other approaches that are feasible but not as far along include making industrial processes more energy efficient, trapping greenhouse gases and developing clean fuels. Ultimately, the goal is to reinvent the global energy infrastructure. Societies have been retooling energy infrastructures for centuries, from water and steam power to petroleum and natural gas to nuclear power and now renewables. This next transformation will be the biggest yet. But we have the scientific understanding and technological savvy to make it happen.
This cover story kicks off a new series for Science News, The Climate Fix. In future issues, we will focus on covering solutions to the climate crisis, including the science behind innovations, the people making them happen, and the social and environmental impacts. You’ll also see expanded climate coverage for our younger readers, ages 9 and up, at Science News Explores online and in print.
With this issue, we also welcome our new publisher, Michael Gordon Voss. He comes to us with deep knowledge of the media industry, experience in both for-profit and nonprofit publishing and a love of science. Before joining Science News Media Group, Voss was publisher of Stanford Social Innovation Review, and vice president and associate publisher at Scientific American. With his arrival, publisher Maya Ajmera takes on her new role as executive publisher. Under her leadership, we have seen unprecedented growth. We’re fortunate to have these two visionaries directing our business strategy amid a rapidly changing media environment.
Patricia Hidalgo-Gonzalez saw the future of energy on a broiling-hot day last September.
An email alert hit her inbox from the San Diego Gas & Electric Company. “Extreme heat straining the grid,” read the message, which was also pinged as a text to 27 million people. “Save energy to help avoid power interruptions.”
It worked. People cut their energy use. Demand plunged, blackouts were avoided and California successfully weathered a crisis exacerbated by climate change. “It was very exciting to see,” says Hidalgo-Gonzalez, an electrical engineer at the University of California, San Diego who studies renewable energy and the power grid. This kind of collective societal response, in which we reshape how we interact with the systems that provide us energy, will be crucial as we figure out how to live on a changing planet.
Earth has warmed at least 1.1 degrees Celsius since the 19th century, when the burning of coal, oil and other fossil fuels began belching heat-trapping gases such as carbon dioxide into the atmosphere. Scientists agree that only drastic action to cut emissions can keep the planet from blasting past 1.5 degrees of warming — a threshold beyond which the consequences become even more catastrophic than the rising sea levels, extreme weather and other impacts the world is already experiencing.
The goal is to achieve what’s known as net-zero emissions, where any greenhouse gases still entering the atmosphere are balanced by those being removed — and to do it as soon as we can.
Scientists say it is possible to swiftly transform the ways we produce and consume energy. To show the way forward, researchers have set out paths toward a world where human activities generate little to no carbon dioxide and other greenhouse gases — a decarbonized economy.
The key to a decarbonized future lies in producing vast amounts of new electricity from sources that emit little to none of the gases, such as wind, solar and hydropower, and then transforming as much of our lives and our industries as possible to run off those sources. Clean electricity needs to power not only the planet’s current energy use but also the increased demands of a growing global population.
Once humankind has switched nearly entirely to clean electricity, we will also have to counterbalance the carbon dioxide we still emit — yes, we will still emit some — by pulling an equivalent amount of carbon dioxide out of the atmosphere and storing it somewhere permanently.
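The net-zero bookkeeping described above boils down to a subtraction: whatever still enters the atmosphere must be matched by what is removed and stored. A minimal sketch, with hypothetical figures:

```python
def net_emissions(gross_gt: float, removed_gt: float) -> float:
    """Net annual emissions (gigatons of CO2): gases still entering the
    atmosphere minus gases pulled out and permanently stored."""
    return gross_gt - removed_gt

# Hypothetical figures: emitting 1.2 Gt while durably removing 1.2 Gt
# is net zero; removing less leaves a positive net.
print(net_emissions(1.2, 1.2))  # 0.0
```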
Achieving net-zero emissions won’t be easy. Getting to effective and meaningful action on climate change requires overcoming decades of inertia and denial about the scope and magnitude of the problem. Nations are falling well short of existing pledges to reduce emissions, and global warming remains on track to charge past 1.5 degrees perhaps even by the end of this decade.
Yet there is hope. The rate of growth in CO2 emissions is slowing globally — down from 3 percent annual growth in the 2000s to half a percent annual growth in the last decade, according to the Global Carbon Project, which quantifies greenhouse gas emissions.
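The difference between those two growth rates compounds quickly. A quick sketch of the arithmetic, using the rates quoted above (the ten-year window is illustrative):

```python
def growth_factor(annual_rate: float, years: int) -> float:
    """Total multiplier after compounding an annual growth rate for `years` years."""
    return (1 + annual_rate) ** years

# 3 percent a year compounds to roughly 34 percent more emissions over a
# decade; half a percent a year compounds to only about 5 percent more.
at_3_percent = growth_factor(0.03, 10)      # ~1.34
at_half_percent = growth_factor(0.005, 10)  # ~1.05
```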
There are signs annual emissions could start shrinking. And over the last two years, the United States, by far the biggest cumulative contributor to global warming, has passed several pieces of federal legislation that include financial incentives to accelerate the transition to clean energy. “We’ve never seen anything at this scale,” says Erin Mayfield, an energy researcher at Dartmouth College.
Though the energy transition will require many new technologies, such as innovative ways to permanently remove carbon from the atmosphere, many of the solutions, such as wind and solar power, are in hand — “stuff we already have,” Mayfield says.

The current state of carbon dioxide emissions

Of all the emissions that need to be slashed, the most important is carbon dioxide, which comes from many sources such as cars and trucks and coal-burning power plants. The gas accounted for 79 percent of U.S. greenhouse gas emissions in 2020. The next most significant greenhouse gas, at 11 percent of emissions in the United States, is methane, which comes from oil and gas operations as well as livestock, landfills and other land uses.
The amount of methane may seem small, but it is mighty — over the short term, methane is more than 80 times as efficient at trapping heat as carbon dioxide, and its atmospheric levels have nearly tripled in the last two centuries. Other greenhouse gases include nitrous oxide, which comes from sources such as applying fertilizer to crops or burning fuels and accounts for 7 percent of U.S. emissions, and human-made fluorinated gases such as hydrofluorocarbons, which account for 3 percent.
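The "small but mighty" point can be made concrete with the standard CO2-equivalent conversion, using the short-term warming figure quoted above (a rough sketch, not an official inventory method):

```python
def co2_equivalent_tons(gas_tons: float, gwp: float) -> float:
    """Convert a greenhouse gas's mass to CO2-equivalent tons using its
    global warming potential (GWP) over a chosen time horizon."""
    return gas_tons * gwp

# With the short-term GWP of ~80 quoted in the text, one ton of methane
# traps as much heat as roughly 80 tons of CO2.
print(co2_equivalent_tons(1.0, 80))  # 80.0
```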
Globally, emissions are dominated by large nations that produce lots of energy. The United States alone emits around 5 billion metric tons of carbon dioxide each year. It has emitted more greenhouse gases over history than any other country and ceded the spot for top annual emitter to China only in the mid-2000s. India ranks third.
Because the United States has produced more carbon pollution to date than any other nation, many researchers and advocates argue that it has a moral responsibility to take the global lead on cutting emissions. And the United States has the most ambitious goals of the major emitters, at least on paper. President Joe Biden has said the country is aiming to reach net-zero emissions by 2050. Leaders in China and India have set net-zero goals of 2060 and 2070, respectively.
Under the auspices of a 2015 international climate change treaty known as the Paris agreement, 193 nations plus the European Union have pledged to reduce their emissions. The agreement aims to keep global warming well below 2 degrees, and ideally to 1.5 degrees, above preindustrial levels. But it is insufficient. Even if all countries cut their emissions as much as they have promised under the Paris agreement, the world would likely blow past 2 degrees of warming before the end of this century.
Every nation continues to find its own path forward. “At the end of the day, all the solutions are going to be country-specific,” says Sha Yu, an earth scientist at the Pacific Northwest National Laboratory and University of Maryland’s Joint Global Change Research Institute in College Park, Md. “There’s not a universal fix.”
But there are some common themes for how to accomplish this energy transition — ways to focus our efforts on the things that will matter most. These are efforts that go beyond individual consumer choices such as whether to fly less or eat less meat. They instead penetrate every aspect of how society produces and consumes energy.
Such massive changes will need to overcome a lot of resistance, including from companies that make money off old forms of energy as well as politicians and lobbyists. But if society can make these changes, it will rank as one of humanity’s greatest accomplishments. We will have tackled a problem of our own making and conquered it.
Here’s a look at what we’ll need to do.
Make as much clean electricity as possible

To meet the need for energy without putting carbon dioxide into the atmosphere, countries would need to dramatically scale up the amount of clean energy they produce. Fortunately, most of that energy would be generated by technologies we already have — renewable sources of energy including wind and solar power.
“Renewables, far and wide, are the key pillar in any net-zero scenario,” says Mayfield, who worked on an influential 2021 report from Princeton University’s Net-Zero America project, which focused on the U.S. economy.
The Princeton report envisions wind and solar power production roughly quadrupling by 2030 to get the United States to net-zero emissions by 2050. That would mean building many new solar and wind farms, so many that in the most ambitious scenario, wind turbines would cover an area the size of Arkansas, Iowa, Kansas, Missouri, Nebraska and Oklahoma combined. Such a scale-up is only possible because prices to produce renewable energy have plunged. The cost of wind power has dropped nearly 70 percent, and solar power nearly 90 percent, over the last decade in the United States. “That was a game changer that I don’t know if some people were expecting,” Hidalgo-Gonzalez says.
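That quadrupling figure implies a demanding sustained build-out rate. A rough sketch of the arithmetic, assuming an eight-year window (the exact period is my assumption, not the report's):

```python
def implied_annual_growth(total_factor: float, years: int) -> float:
    """Annual growth rate implied by scaling capacity up
    `total_factor`-fold over `years` years."""
    return total_factor ** (1 / years) - 1

# Quadrupling over roughly eight years implies sustained growth of
# nearly 19 percent per year, every year.
rate = implied_annual_growth(4, 8)  # ~0.189
```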
Globally, the price drop in renewables has allowed growth to surge; China, for instance, installed a record 55 gigawatts of solar power capacity in 2021, for a total of 306 gigawatts, or nearly 13 percent of the nation’s installed capacity to generate electricity. China is almost certain to have had another record year for solar power installations in 2022.
Challenges include figuring out ways to store and transmit all that extra electricity, and finding locations for wind and solar power installations that are acceptable to local communities. Other types of low-carbon power, such as hydropower and nuclear power — which comes with its own public resistance — will also likely play a role going forward.

Get efficient and go electric

The drive toward net-zero emissions also requires boosting energy efficiency across industries and electrifying as many aspects of modern life as possible, such as transportation and home heating.
Some industries are already shifting to more efficient methods of production, such as steelmaking in China that incorporates hydrogen-based furnaces that are much cleaner than coal-fired ones, Yu says. In India, simply closing down the most inefficient coal-burning power plants provides the most bang for the buck, says Shayak Sengupta, an energy and policy expert at the Observer Research Foundation America think tank in Washington, D.C. “The list has been made up,” he says, of the plants that should close first, “and that’s been happening.”
To achieve net-zero, the United States would need to increase its share of electric heat pumps, which heat houses much more cleanly than gas- or oil-fired appliances, from around 10 percent in 2020 to as much as 80 percent by 2050, according to the Princeton report. Federal subsidies for these sorts of appliances are rolling out in 2023 as part of the new Inflation Reduction Act, legislation that contains a number of climate-related provisions.
Shifting cars and other vehicles away from burning gasoline to running off of electricity would also lead to significant emissions cuts. In a major 2021 report, the National Academies of Sciences, Engineering and Medicine said that one of the most important moves in decarbonizing the U.S. economy would be having electric vehicles account for half of all new vehicle sales by 2030. That’s not impossible; electric cars accounted for nearly 6 percent of new vehicle sales in the United States in 2022 — still a low number, but nearly double the previous year’s share.
Make clean fuels

Some industries, such as manufacturing and transportation, can’t be fully electrified using current technologies — battery-powered airplanes, for instance, will probably never be feasible for long-duration flights. Technologies that still require liquid fuels will need to switch from gas, oil and other fossil fuels to low-carbon or zero-carbon fuels.
One major player will be fuels extracted from plants and other biomass, which take up carbon dioxide as they grow and release it when they die, making them essentially carbon neutral over their lifetime. To create biofuels, farmers grow crops that conversion facilities then process into fuels such as hydrogen. Hydrogen, in turn, can be substituted for more carbon-intensive substances in various industrial processes, such as making plastics and fertilizers — and maybe even someday as fuel for airplanes.
In one of the Princeton team’s scenarios, the U.S. Midwest and Southeast would become peppered with biomass conversion plants by 2050, so that fuels can be processed close to where crops are grown. Many of the biomass feedstocks could potentially grow alongside food crops or replace other, nonfood crops.

Cut methane and other non-CO2 emissions

Greenhouse gas emissions other than carbon dioxide will also need to be slashed. In the United States, the majority of methane emissions come from livestock, landfills and other agricultural sources, as well as scattered sources such as forest fires and wetlands. But about one-third of U.S. methane emissions come from oil, gas and coal operations. These may be some of the first places that regulators can target for cleanup, especially “super emitters” that can be pinpointed using satellites and other types of remote sensing.
In 2021, the United States and the European Union unveiled a global methane pledge, now endorsed by 150 countries, to reduce emissions. There is, however, no enforcement mechanism yet. And China, the world’s largest methane emitter, has not signed on.
Nitrous oxide emissions could be reduced by improving soil management techniques, and fluorinated gases by finding alternatives and improving production and recycling efforts.
Sop up as much CO2 as possible

Once emissions have been cut as much as possible, reaching net-zero will mean removing and storing an amount of carbon equivalent to what society still emits.
One solution already in use is to capture carbon dioxide produced at power plants and other industrial facilities and store it permanently somewhere, such as deep underground. Globally there are around 35 such operations, which collectively draw down around 45 million tons of carbon dioxide annually. About 200 new plants are on the drawing board to be operating by the end of this decade, according to the International Energy Agency.
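The text's own numbers show how far current carbon capture is from the required scale. A quick sketch using only the figures quoted above:

```python
# Figures from the text: ~35 operations capture ~45 million tons of CO2
# per year, while the United States alone emits ~5 billion tons per year.
captured_mt = 45.0
us_emissions_mt = 5_000.0
num_plants = 35

share_of_us = captured_mt / us_emissions_mt   # 0.009, under 1 percent
avg_per_plant_mt = captured_mt / num_plants   # ~1.3 million tons each
```

At that average per-facility rate, closing the gap is an infrastructure problem measured in thousands of plants, which is why the report's pipeline network is so large.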
The Princeton report envisions carbon capture being added to almost every kind of U.S. industrial plant, from cement production to biomass conversion. Much of the carbon dioxide would be liquefied and piped along more than 100,000 kilometers of new pipelines to deep geologic storage, primarily along the Texas Gulf Coast, where underground reservoirs can trap it permanently. This would be a massive infrastructure effort: Building the pipeline network could cost up to $230 billion, including $13 billion just for permitting and early buy-in from local communities.
Another way to sop up carbon is to get forests and soils to take up more. That could be accomplished by converting crops that are relatively carbon-intensive, such as corn to be used in ethanol, to energy-rich grasses that can be used for more efficient biofuels, or by turning some cropland or pastures back into forest. It’s even possible to sprinkle crushed rock onto croplands, which accelerates natural weathering processes that suck carbon dioxide out of the atmosphere.
Another way to increase the amount of carbon stored in the land is to reduce the amount of the Amazon rainforest that is cut down each year. “For a few countries like Brazil, preventing deforestation will be the first thing you can do,” Yu says.
When it comes to climate change, there’s no time to waste

The Princeton team estimates that the United States would need to invest at least an additional $2.5 trillion over the next 10 years for the country to have a shot at achieving net-zero emissions by 2050. Congress has begun ramping up funding with two large pieces of federal legislation it passed in 2021 and 2022. Those steer more than $1 trillion toward modernizing major parts of the nation’s economy over a decade — including investing in the energy transition to help fight climate change.
Between now and 2030, solar and wind power, plus increasing energy efficiency, can deliver about half of the emissions reductions needed for this decade, the International Energy Agency estimates. After that, the primary drivers would need to be increasing electrification, carbon capture and storage, and clean fuels such as hydrogen. The trick is to do all of this without making people’s lives worse. Developing nations need to be able to supply energy for their economies to develop. Communities whose jobs relied on fossil fuels need to have new economic opportunities.
Julia Haggerty, a geographer at Montana State University in Bozeman who studies communities that are dependent on natural resources, says that those who have money and other resources to support the transition will weather the change better than those who are under-resourced now. “At the landscape of states and regions, it just remains incredibly uneven,” she says.
The ongoing energy transition also faces unanticipated shocks such as Russia’s invasion of Ukraine, which sent energy prices soaring in Europe, and the COVID-19 pandemic, which initially slashed global emissions but later saw them rebound.
But the technologies exist for us to wean our lives off fossil fuels. And we have the inventiveness to develop more as needed. Transforming how we produce and use energy, as rapidly as possible, is a tremendous challenge — but one that we can meet head-on. For Mayfield, getting to net-zero by 2050 is a realistic goal for the United States. “I think it’s possible,” she says. “But it doesn’t mean there’s not a lot more work to be done.”
Humankind is seeing Neptune’s rings in a whole new light thanks to the James Webb Space Telescope.
In an infrared image released September 21, Neptune and its gossamer diadems of dust take on an ethereal glow against the inky backdrop of space. The stunning portrait is a huge improvement over the rings’ previous close-up, which was taken more than 30 years ago.
Unlike the dazzling belts encircling Saturn, Neptune’s rings appear dark and faint in visible light, making them difficult to see from Earth. The last time anyone saw Neptune’s rings was in 1989, when NASA’s Voyager 2 spacecraft, after tearing past the planet, snapped a couple of grainy photos from roughly 1 million kilometers away (SN: 8/7/17). In those photos, taken in visible light, the rings appear as thin, concentric arcs.
As Voyager 2 sped onward into interplanetary space, Neptune’s rings once again went into hiding — until July. That’s when the James Webb Space Telescope, or JWST, turned its sharp, infrared gaze toward the planet from roughly 4.4 billion kilometers away (SN: 7/11/22).

Neptune itself appears mostly dark in the new image. That’s because methane gas in the planet’s atmosphere absorbs much of its infrared light. A few bright patches mark where high-altitude methane ice clouds reflect sunlight.
And then there are the ever-elusive rings. “The rings have lots of ice and dust in them, which are extremely reflective in infrared light,” says Stefanie Milam, a planetary scientist at NASA’s Goddard Space Flight Center in Greenbelt, Md., and one of JWST’s project scientists. The enormity of the telescope’s mirror also makes its images extra sharp. “JWST was designed to look at the first stars and galaxies across the universe, so we can really see fine details that we haven’t been able to see before,” Milam says.
Upcoming JWST observations will look at Neptune with other scientific instruments. That should provide new intel on the rings’ composition and dynamics, as well as on how Neptune’s clouds and storms evolve, Milam says. “There’s more to come.”
As people around the world marveled in July at the most detailed pictures of the cosmos snapped by the James Webb Space Telescope, biologists got their first glimpses of a different set of images — ones that could help revolutionize life sciences research.
The images are the predicted 3-D shapes of more than 200 million proteins, rendered by an artificial intelligence system called AlphaFold. “You can think of it as covering the entire protein universe,” said Demis Hassabis at a July 26 news briefing. Hassabis is cofounder and CEO of DeepMind, the London-based company that created the system.

Combining several deep-learning techniques, the computer program is trained to predict protein shapes by recognizing patterns in structures that have already been solved through decades of experimental work using electron microscopes and other methods. The AI’s first splash came in 2021, with predictions for 350,000 protein structures — including almost all known human proteins. DeepMind partnered with the European Bioinformatics Institute of the European Molecular Biology Laboratory to make the structures available in a public database.
July’s massive new release expanded the library to “almost every organism on the planet that has had its genome sequenced,” Hassabis said. “You can look up a 3-D structure of a protein almost as easily as doing a key word Google search.”
These are predictions, not actual structures. Yet researchers have used some of the 2021 predictions to develop potential new malaria vaccines, improve understanding of Parkinson’s disease, work out how to protect honeybee health, gain insight into human evolution and more. DeepMind has also focused AlphaFold on neglected tropical diseases, including Chagas disease and leishmaniasis, which can be debilitating or lethal if left untreated.

The release of the vast dataset was greeted with excitement by many scientists. But others worry that researchers will take the predicted structures as the true shapes of proteins. There are still things AlphaFold can’t do — and wasn’t designed to do — that need to be tackled before the protein cosmos completely comes into focus.
Having the new catalog open to everyone is “a huge benefit,” says Julie Forman-Kay, a protein biophysicist at the Hospital for Sick Children and the University of Toronto. In many cases, AlphaFold and RoseTTAFold, another AI researchers are excited about, predict shapes that match up well with protein profiles from experiments. But, she cautions, “it’s not that way across the board.”
Predictions are more accurate for some proteins than for others. Erroneous predictions could leave some scientists thinking they understand how a protein works when really, they don’t. Painstaking experiments remain crucial to understanding how proteins fold, Forman-Kay says. “There’s this sense now that people don’t have to do experimental structure determination, which is not true.”

Plodding progress

Proteins start out as long chains of amino acids and fold into a host of curlicues and other 3-D shapes. Some resemble the tight corkscrew ringlets of a 1980s perm or the pleats of an accordion. Others could be mistaken for a child’s spiraling scribbles.
A protein’s architecture is more than just aesthetics; it can determine how that protein functions. For instance, proteins called enzymes need a pocket where they can capture small molecules and carry out chemical reactions. And proteins that work in a protein complex, two or more proteins interacting like parts of a machine, need the right shapes to snap into formation with their partners.
Knowing the folds, coils and loops of a protein’s shape may help scientists decipher how, for example, a mutation alters that shape to cause disease. That knowledge could also help researchers make better vaccines and drugs.
For years, scientists have bombarded protein crystals with X-rays, flash-frozen cells and examined them under high-powered electron microscopes, and used other methods to discover the secrets of protein shapes. Such experimental methods take “a lot of personnel time, a lot of effort and a lot of money. So it’s been slow,” says Tamir Gonen, a membrane biophysicist and Howard Hughes Medical Institute investigator at the David Geffen School of Medicine at UCLA. Such meticulous and expensive experimental work has uncovered the 3-D structures of more than 194,000 proteins, their data files stored in the Protein Data Bank, supported by a consortium of research organizations. But the accelerating pace at which geneticists are deciphering the DNA instructions for making proteins has far outstripped structural biologists’ ability to keep up, says systems biologist Nazim Bouatta of Harvard Medical School. “The question for structural biologists was, how do we close the gap?” he says.
For many researchers, the dream has been to have computer programs that could examine the DNA of a gene and predict how the protein it encodes would fold into a 3-D shape.
Here comes AlphaFold

Over many decades, scientists made progress toward that goal. But “until two years ago, we were really a long way from anything like a good solution,” says John Moult, a computational biologist at the University of Maryland’s Rockville campus.
Moult is one of the organizers of a competition: the Critical Assessment of protein Structure Prediction, or CASP. Organizers give competitors a set of proteins for their algorithms to fold and compare the machines’ predictions against experimentally determined structures. Most AIs failed to get close to the actual shapes of the proteins. Then in 2020, AlphaFold showed up in a big way, predicting the structures of 90 percent of test proteins with high accuracy, including two-thirds with accuracy rivaling experimental methods.
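CASP compares each predicted structure against the experimentally determined one; its headline number is the Global Distance Test (GDT_TS), roughly the average fraction of a protein’s residues whose predicted positions land within 1, 2, 4 and 8 angstroms of their true positions. The Python toy below sketches that idea; it assumes the two structures are already superimposed (real CASP scoring searches over superpositions), and the coordinates are made up.

```python
# Sketch of the GDT_TS score used by CASP: the average, over distance
# cutoffs of 1, 2, 4 and 8 angstroms, of the fraction of residues whose
# predicted position lies within that cutoff of the experimental one.
# Assumes the two structures are already optimally superimposed.
import math

def gdt_ts(predicted, experimental, cutoffs=(1.0, 2.0, 4.0, 8.0)):
    """predicted, experimental: equal-length lists of (x, y, z) coordinates."""
    assert len(predicted) == len(experimental)
    n = len(predicted)
    # Distance between each predicted residue and its experimental position
    distances = [math.dist(p, e) for p, e in zip(predicted, experimental)]
    # Fraction of residues within each cutoff, averaged and scaled to 0-100
    fractions = [sum(d <= c for d in distances) / n for c in cutoffs]
    return 100 * sum(fractions) / len(fractions)

# Toy example: three residues, one slightly off, one far off.
pred = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0)]
expt = [(0.5, 0.0, 0.0), (3.8, 1.5, 0.0), (7.6, 0.0, 9.0)]
print(round(gdt_ts(pred, expt), 1))
```

A perfect prediction scores 100; AlphaFold’s 2020 results pushed median scores into the range previously reached only by experiments.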
Deciphering the structure of single proteins had been the core of the CASP competition since its inception in 1994. With AlphaFold’s performance, “suddenly, that was essentially done,” Moult says.
Since AlphaFold’s 2021 release, more than half a million scientists have accessed its database, Hassabis said in the news briefing. Some researchers, for example, have used AlphaFold’s predictions to help them get closer to completing a massive biological puzzle: the nuclear pore complex. Nuclear pores are key portals that allow molecules in and out of cell nuclei. Without the pores, cells wouldn’t work properly. Each pore is huge, relatively speaking, composed of about 1,000 pieces of 30 or so different proteins. Researchers had previously managed to place about 30 percent of the pieces in the puzzle. That puzzle is now almost 60 percent complete, after combining AlphaFold predictions with experimental techniques to understand how the pieces fit together, researchers reported in the June 10 Science.
Now that AlphaFold has pretty much solved how to fold single proteins, this year CASP organizers are asking teams to work on the next challenges: Predict the structures of RNA molecules and model how proteins interact with each other and with other molecules.
For those sorts of tasks, Moult says, deep-learning AI methods “look promising but have not yet delivered the goods.”
Where AI falls short

Being able to model protein interactions would be a big advantage because most proteins don’t operate in isolation. They work with other proteins or other molecules in cells. But AlphaFold’s accuracy at predicting how the shapes of two proteins might change when the proteins interact is “nowhere near” that of its spot-on projections for a slew of single proteins, says Forman-Kay, the University of Toronto protein biophysicist. That’s something AlphaFold’s creators acknowledge too.
The AI was trained to fold proteins by examining the contours of known structures. And many fewer multiprotein complexes than single proteins have been solved experimentally. Forman-Kay studies proteins that refuse to be confined to any particular shape. These intrinsically disordered proteins are typically as floppy as wet noodles (SN: 2/9/13, p. 26). Some will fold into defined forms when they interact with other proteins or molecules. And they can fold into new shapes when paired with different proteins or molecules to do various jobs.
AlphaFold’s predicted shapes reach a high confidence level for about 60 percent of wiggly proteins that Forman-Kay and colleagues examined, the team reported in a preliminary study posted in February at bioRxiv.org. Often the program depicts the shapeshifters as long corkscrews called alpha helices.
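AlphaFold reports that confidence per residue as a pLDDT score from 0 to 100, stored in the B-factor column of its predicted PDB files. The sketch below shows one way a researcher might read those scores; the mini PDB text is invented for illustration, and the 70-point “confident” threshold is a common convention rather than anything fixed by AlphaFold.

```python
# Minimal sketch: reading AlphaFold's per-residue confidence (pLDDT)
# from a predicted structure. AlphaFold stores pLDDT (0-100) in the
# B-factor column of its PDB output; one CA atom per residue suffices.
# The sample PDB text below is made up for illustration.

SAMPLE_PDB = """\
ATOM      1  CA  MET A   1      11.104   6.134  -6.504  1.00 92.50
ATOM      2  CA  ASP A   2      12.560   7.890  -4.210  1.00 88.10
ATOM      3  CA  SER A   3      14.010   9.500  -2.000  1.00 45.30
ATOM      4  CA  GLY A   4      15.700  11.020   0.300  1.00 38.70
"""

def plddt_per_residue(pdb_text):
    """Read the B-factor (pLDDT) of each CA atom in a PDB-format string."""
    scores = []
    for line in pdb_text.splitlines():
        if line.startswith("ATOM") and line[12:16].strip() == "CA":
            scores.append(float(line[60:66]))  # B-factor, columns 61-66
    return scores

scores = plddt_per_residue(SAMPLE_PDB)
# 70 is a commonly used (not official) cutoff for a "confident" residue
confident = sum(s >= 70 for s in scores) / len(scores)
print(f"mean pLDDT: {sum(scores) / len(scores):.1f}, "
      f"confident residues: {confident:.0%}")
```

For a floppy, disordered protein, runs of low-pLDDT residues like the last two above are exactly what Forman-Kay’s group looks for.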
Forman-Kay’s group compared AlphaFold’s predictions for three disordered proteins with experimental data. The structure that the AI assigned to a protein called alpha-synuclein resembles the shape that the protein takes when it interacts with lipids, the team found. But that’s not the way the protein looks all the time.
For another protein, called eukaryotic translation initiation factor 4E-binding protein 2, AlphaFold predicted a mishmash of the protein’s two shapes when working with two different partners. That Frankenstein structure, which doesn’t exist in actual organisms, could mislead researchers about how the protein works, Forman-Kay and colleagues say.

AlphaFold may also be a little too rigid in its predictions. A static “structure doesn’t tell you everything about how a protein works,” says Jane Dyson, a structural biologist at the Scripps Research Institute in La Jolla, Calif. Even single proteins with generally well-defined structures aren’t frozen in space. Enzymes, for example, undergo small shape changes when shepherding chemical reactions.
If you ask AlphaFold to predict the structure of an enzyme, it will show a fixed image that may closely resemble what scientists have determined by X-ray crystallography, Dyson says. “But [it will] not show you any of the subtleties that are changing as the different partners” interact with the enzyme.
“The dynamics are what Mr. AlphaFold can’t give you,” Dyson says.
A revolution in the making

The computer renderings do give biologists a head start on solving problems such as how a drug might interact with a protein. But scientists should remember one thing: “These are models,” not experimentally deciphered structures, says Gonen, at UCLA.
He uses AlphaFold’s protein predictions to help make sense of experimental data, but he worries that researchers will accept the AI’s predictions as gospel. If that happens, “the risk is that it will become harder and harder and harder to justify why you need to solve an experimental structure.” That could lead to reduced funding, talent and other resources for the types of experiments needed to check the computer’s work and forge new ground, he says. Harvard Medical School’s Bouatta is more optimistic. He thinks that researchers probably don’t need to invest experimental resources in the types of proteins that AlphaFold does a good job of predicting, which should help structural biologists triage where to put their time and money.
“There are proteins for which AlphaFold is still struggling,” Bouatta agrees. Researchers should spend their capital there, he says. “Maybe if we generate more [experimental] data for those challenging proteins, we could use them for retraining another AI system” that could make even better predictions.
He and colleagues have already reverse engineered AlphaFold to make a version called OpenFold that researchers can train to solve other problems, such as those gnarly but important protein complexes.
Massive amounts of DNA generated by the Human Genome Project have made a wide range of biological discoveries possible and opened up new fields of research (SN: 2/12/22, p. 22). Having structural information on 200 million proteins could be similarly revolutionary, Bouatta says.
In the future, thanks to AlphaFold and its AI kin, he says, “we don’t even know what sorts of questions we might be asking.”
The answer to one of the greatest mysteries of the universe may come down to one of the smallest, and spookiest, particles.
Matter is common in the cosmos. Everything around us — from planets to stars to puppies — is made up of matter. But matter has a flip side: antimatter. Protons, electrons and other particles all have antimatter counterparts: antiprotons, positrons, etc. Yet for some reason antimatter is much rarer than matter — and no one knows why. Physicists believe the universe was born with equal amounts of matter and antimatter. Since matter and antimatter counterparts annihilate on contact, that suggests the universe should have ended up with nothing but energy. Something must have tipped the balance.
Some physicists think lightweight subatomic particles called neutrinos could point to an answer. These particles are exceedingly tiny, with less than a millionth the mass of an electron (SN: 4/21/21). They’re produced in radioactive decays and in the sun and other cosmic environments. Known for their ethereal tendency to evade detection, neutrinos have earned the nickname “ghost particles.” These spooky particles, originally thought to have no mass at all, have a healthy track record of producing scientific surprises (SN: 10/6/15).
Now researchers are building enormous detectors to find out if neutrinos could help solve the mystery of the universe’s matter. The Hyper-Kamiokande experiment in Hida City, Japan, and the Deep Underground Neutrino Experiment in Lead, S.D., will study neutrinos and their antimatter counterparts, antineutrinos. A difference in neutrinos’ and antineutrinos’ behavior might hint at the origins of the matter-antimatter imbalance, scientists suspect.
Watch the video below to find out how neutrinos might reveal why the universe contains, well, anything at all.
Face masks — the unofficial symbol of the COVID-19 pandemic — are leveling up.
A mask outfitted with special electronics can detect SARS-CoV-2, the virus that causes COVID-19, and other airborne viruses within 10 minutes of exposure, materials researcher Yin Fang and colleagues report September 19 in Matter.
“The lightness and wearability of this face mask allows users to wear it anytime, anywhere,” says Fang, of Tongji University in Shanghai. “It’s expected to serve as an early warning system to prevent large outbreaks of respiratory infectious diseases.”

Airborne viruses can hitch a ride between hosts in the air droplets that people breathe in and out. People infected with a respiratory illness can expel thousands of virus-containing droplets by talking, coughing and sneezing. Even those with no signs of being sick can sometimes pass on these viruses; people who are infected with SARS-CoV-2 can start infecting others at least two to three days before showing symptoms (SN: 3/13/20). So viruses often have a head start when it comes to infecting new people.
Fang and his colleagues designed a special sensor that reacts to the presence of certain viral proteins in the air and attached it to a face mask. The team then spritzed droplets containing proteins produced by the viruses that cause COVID-19, bird flu or swine flu into a chamber with the mask.
The sensor could detect just a fraction of a microliter of these proteins — a cough might contain 10 to 80 times as much. Once a pathogen was detected, the sensor-mask combo sent a signal to the researchers informing them of the virus’s presence. Ultimately, the researchers plan for such signals to be sent to a wearer’s phone or other devices. By combining this technology with more conventional testing, the team says, health care providers and public health officials might be able to better contain future pandemics.
Getting back out into society after a long isolation can be awkward. Just ask the Pahrump poolfish, loners in a desert for some 10,000 years.
This hold-in-your-hand-size fish (Empetrichthys latos) has a chubby, torpedo shape and a mouth that looks as if it’s almost smiling. Until the 1950s, this species had three forms, each evolving in its own spring. Now only one survives, which developed in a spring-fed oasis in the Mojave Desert’s Pahrump Valley, about an hour’s drive west of Las Vegas.
Fish in a desert are not that weird when you take the long view (SN: 1/26/16). In a former life, some desert valleys were ancient lakes. As the region’s lakes dried up, fish got stuck in the remaining puddles. Various stranded species over time adapted to quirks of their private microlakes, and a desert-fish version of the Galapagos Islands’ diverse finches arose. “We like to say that Darwin, if he had a different travel agent, could have come to the same conclusions just from the desert,” says evolutionary biologist Craig Stockwell of North Dakota State University in Fargo.
The desert “island” where E. latos evolved was Manse Spring on a private ranch. From a distance, the spring looked “just like a little clump of trees,” remembers ecologist Shawn Goodchild, who is now based in Lake Park, Minn. The spot of desert greenery surrounded the Pahrump poolfish’s entire native range, about the length of an Olympic swimming pool.
By the 1960s, biologists feared the fish were doomed. The spring’s flow rate had dropped some 70 percent as irrigation for farms in the desert sucked away water. And disastrous predators arrived: a kid’s discarded goldfish. Conservation managers fought back, but neither poison nor dynamite wiped out the newcomers. And then in August of 1975, Manse Spring dried up.
Conservation managers had moved some poolfish to other springs, but the long-isolated species just didn’t seem to get the dangers of living with other kinds of fishes. The poolfish were easily picked off by predators in their new home.
Lab tests of fake fish-murder scenes may help explain why. For instance, researchers tainted aquarium water with pureed fish bits. In an expected reaction, fathead minnows (Pimephales promelas) freaked at traces of dead minnow drifting through the water and huddled low in the tank. The Pahrump poolfish in water tainted with blender-whizzed skin of their kind just kept swimming around the upper waters as if corpse taint were no scarier than tap water. Literally. Stockwell and colleagues can say that because they ran a fear test with nonscary dechlorinated tap water. Poolfish didn’t huddle then either, the team reports in the Aug. 31 Proceedings of the Royal Society B.
Then, however, Stockwell and a colleague were musing about some rescued poolfish in cattle tanks when nearby dragonflies caught the researchers’ attention.
Before dragonflies mature into shimmering aerial marvels, the young prowl underwater as violent predators. In moves worthy of scary aliens in a sci-fi movie, many dragonfly nymphs can shoot their jaws out from their head to scoop up prey, including fish eggs and fish larvae. With young dragonflies prowling a pool’s bottom and plants, poolfish moving up the water column “would be a good way to reduce their risk,” Stockwell says. Testing of that idea has begun.
Fish that people thought were foolishly naïve may just be savvy in a different way. Especially after isolation in a desert with dragons.