With the blessings of the families of all 14 lost astronauts, a new memorial to the Challenger and Columbia space shuttle disasters opened in June at the Kennedy Space Center in Florida. The permanent exhibit includes the first pieces of shuttle wreckage ever on public display, but fittingly focuses more on the lives lost.
“Forever Remembered” is housed inside the space center’s new $100 million exhibit about the space shuttle Atlantis. Below the nose of the intact shuttle, visitors enter a hall lit by tributes to each astronaut from the lost missions, those from Challenger on the left and Columbia on the right. Each display includes glimpses of the astronaut’s life. Items include plans for remodeling the home of Challenger pilot Michael Smith and a recovered page in Hebrew from the Columbia flight journal of Ilan Ramon, a payload specialist and the first Israeli astronaut.
Past the hall, visitors enter a small gallery with a single piece of each shuttle: a body panel from Challenger (shown at left) and cockpit window frames from Columbia. There are no extended written descriptions or flashy videos. In short, it’s a place for pondering rather than learning. As a ninth-grader in school 50 miles away when Challenger exploded in 1986 and as an adult who waited for a telltale sonic boom that never came when Columbia was lost during re-entry in 2003, I found the effect powerful.
The exhibit’s exit hallway reveals the tragedies from multiple perspectives on video displays. One video details the massive efforts to recover the wreckage and remains from the disasters, from the ocean for Challenger and from land for Columbia. Others focus on the emotional tolls and the critical shuttle launches that followed each completed investigation.
Michael Curie, Kennedy Space Center’s news chief, says family members have been both supportive and grateful for the exhibit. “They feel that it humanizes their family members in a way that never has been done before,” he says. Indeed, “Forever Remembered” is an effective reminder of the very real risks each astronaut willingly and bravely faced.
A female giraffe has a great Valentine’s Day gift for potential mates: urine.
Distinctive anatomy helps male giraffes get a taste for whether a female is ready to mate, animal behaviorists Lynette and Benjamin Hart report January 19 in Animals. A pheromone-detecting organ in giraffes has a stronger connection to the mouth than the nose, the researchers found. That’s why males scope out which females to mate with by sticking their tongues in a urine stream. Animals such as male gazelles will lick fresh urine on the ground to track if females are ready to mate. But giraffes’ long necks and heavy heads make bending over to investigate urine on the ground an unstable and vulnerable position, says Lynette Hart, of the University of California, Davis.
The researchers observed giraffes (Giraffa giraffa angolensis) in Etosha National Park in Namibia in 1994, 2002 and 2004. Bull giraffes nudged or kicked the female to ask her to pee. If she was a willing participant, she urinated for a few seconds, while the male took a sip. Then the male curled his lip and inhaled with his mouth, a behavior called a flehmen response, to pull the female’s scent into two openings on the roof of the mouth. From the mouth, the scent travels to the vomeronasal organ, or VNO, which detects pheromones.
The Harts say they never saw a giraffe investigate urine on the ground.
Unlike many other mammals, giraffes have a stronger oral connection to the VNO, via a duct, than a nasal one, examinations of preserved giraffe specimens showed. One possible explanation for the difference could be that a VNO-nose link helps animals that breed at specific times of the year detect seasonal plants, says Benjamin Hart, a veterinarian also at the University of California, Davis. But giraffes can mate any time of year, so the nasal connection may not matter as much.
Animals cover themselves in all kinds of unsavory fluids to keep cool. Humans sweat, kangaroos spit and some birds will urinate on themselves to survive hot days. It turns out that echidnas do something much cuter — though perhaps just as sticky (and slightly icky) — to beat the heat.
The spiny insectivores stay cool by blowing snot bubbles, researchers report January 18 in Biology Letters. The bubbles pop, keeping the critters’ noses moist. As it evaporates, this moisture draws heat away from a blood-filled sinus in the echidna’s beak, helping to cool the animal’s blood. Short-beaked echidnas (Tachyglossus aculeatus) look a bit like hedgehogs but are really monotremes — egg-laying mammals unique to Australia and New Guinea (SN: 11/18/16). Previous lab studies showed that temperatures above 35° Celsius (95° Fahrenheit) should kill echidnas. But echidnas don’t seem to have gotten the memo. They live everywhere from tropical rainforests to deserts to snow-capped peaks, leaving scientists with a physiological puzzle.
Mammals evaporate water to keep cool when temperatures climb above their body temperatures, says environmental physiologist Christine Cooper of Curtin University in Perth, Australia. “Lots of mammals do that by either licking, sweating or panting,” she says. “Echidnas weren’t believed to be able to do that.” But it’s known that the critters blow snot bubbles when it gets hot.
So, armed with a heat-vision camera and a telephoto lens, Cooper and environmental physiologist Philip Withers of the University of Western Australia in Perth drove through nature reserves in Western Australia once a month for a year to film echidnas.
In infrared, the warmest parts of the echidnas’ spiny bodies glowed in oranges, yellows and whites. But the video revealed that the tips of their noses were dark purple blobs, kept cool as moisture from their snot bubbles evaporated. Echidnas might also lose heat through their bellies and legs, the researchers report, while their spikes could act as an insulator. “Finding a way of doing this work in the field is pretty exciting,” says physiological ecologist Stewart Nicol of the University of Tasmania in Hobart, Australia, who was not involved in the study. “You can understand animals and see how they’re responding to their normal environment.” The next step, he says, is to quantify how much heat echidnas really lose through their noses and other body parts.
Monotremes parted evolutionary ways with other mammals between 250 million and 160 million years ago as the supercontinent Pangaea broke apart (SN: 3/8/15). So “they have a whole lot of traits that are considered to be primitive,” Cooper says. “Understanding how they might thermoregulate can give us some ideas about how thermal regulation … might have evolved in mammals.”
More than a century ago, scientists proved that carbon dioxide in Earth’s atmosphere could act like a thermostat — adding more CO2 would turn up the heat, removing it would chill the planet. But back then, most scientists thought that Earth’s climate system was far too large and stable to change quickly, that any fluctuations would happen over such a long timescale that it wouldn’t matter much to everyday life (SN: 3/12/22, p. 16).
Now all it takes is a look at the Weather Channel to know how wrong scientists were. Things are changing fast. Last year alone, Europe, South Asia, China, Japan and the American West endured deadly, record-breaking heat waves (SN: 12/17/22 & 12/31/22, p. 38). As I write this, torrential rains are bringing death and destruction to California. And with levels of climate-warming gases continuing to increase in the atmosphere, extreme weather events will become even more frequent. Given the vastness of this threat, it’s tempting to think that any efforts that we make against it will be futile. But that’s not true. Around the world, scientists and engineers; entrepreneurs and large corporations; state, national and local governments; and international coalitions are acting to put the brakes on climate change. Last year, the United States signed into law a $369 billion investment in renewable energy technologies and other responses (SN: 12/17/22 & 12/31/22, p. 28). And the World Bank invested $31.7 billion to assist other countries.
In this issue, contributing correspondent Alexandra Witze details the paths forward: which responses will help the most, and which remain challenging. Shifting to renewable energy sources like wind and solar should be the easiest. We already have the technology, and costs have plunged over the last decade. Other approaches that are feasible but not as far along include making industrial processes more energy efficient, trapping greenhouse gases and developing clean fuels. Ultimately, the goal is to reinvent the global energy infrastructure. Societies have been retooling energy infrastructures for centuries, from water and steam power to petroleum and natural gas to nuclear power and now renewables. This next transformation will be the biggest yet. But we have the scientific understanding and technological savvy to make it happen.
This cover story kicks off a new series for Science News, The Climate Fix. In future issues, we will focus on covering solutions to the climate crisis, including the science behind innovations, the people making them happen, and the social and environmental impacts. You’ll also see expanded climate coverage for our younger readers, ages 9 and up, at Science News Explores online and in print.
With this issue, we also welcome our new publisher, Michael Gordon Voss. He comes to us with deep knowledge of the media industry, experience in both for-profit and nonprofit publishing and a love of science. Before joining Science News Media Group, Voss was publisher of Stanford Social Innovation Review, and vice president and associate publisher at Scientific American. With his arrival, Maya Ajmera, previously our publisher, takes on her new role as executive publisher. Under her leadership, we have seen unprecedented growth. We’re fortunate to have these two visionaries directing our business strategy amid a rapidly changing media environment.
Patricia Hidalgo-Gonzalez saw the future of energy on a broiling-hot day last September.
An email alert hit her inbox from the San Diego Gas & Electric Company. “Extreme heat straining the grid,” read the message, which was also pinged as a text to 27 million people. “Save energy to help avoid power interruptions.”
It worked. People cut their energy use. Demand plunged, blackouts were avoided and California successfully weathered a crisis exacerbated by climate change. “It was very exciting to see,” says Hidalgo-Gonzalez, an electrical engineer at the University of California, San Diego who studies renewable energy and the power grid. This kind of collective societal response, in which we reshape how we interact with the systems that provide us energy, will be crucial as we figure out how to live on a changing planet.
Earth has warmed at least 1.1 degrees Celsius since the 19th century, when the burning of coal, oil and other fossil fuels began belching heat-trapping gases such as carbon dioxide into the atmosphere. Scientists agree that only drastic action to cut emissions can keep the planet from blasting past 1.5 degrees of warming — a threshold beyond which the consequences become even more catastrophic than the rising sea levels, extreme weather and other impacts the world is already experiencing.
The goal is to achieve what’s known as net-zero emissions, where any greenhouse gases still entering the atmosphere are balanced by those being removed — and to do it as soon as we can.
Scientists say it is possible to swiftly transform the ways we produce and consume energy. To show the way forward, researchers have set out paths toward a world where human activities generate little to no carbon dioxide and other greenhouse gases — a decarbonized economy.
The key to a decarbonized future lies in producing vast amounts of new electricity from sources that emit little to none of the gases, such as wind, solar and hydropower, and then transforming as much of our lives and our industries as possible to run off those sources. Clean electricity needs to power not only the planet’s current energy use but also the increased demands of a growing global population.
Once humankind has switched nearly entirely to clean electricity, we will also have to counterbalance the carbon dioxide we still emit — yes, we will still emit some — by pulling an equivalent amount of carbon dioxide out of the atmosphere and storing it somewhere permanently.
Achieving net-zero emissions won’t be easy. Getting to effective and meaningful action on climate change requires overcoming decades of inertia and denial about the scope and magnitude of the problem. Nations are falling well short of existing pledges to reduce emissions, and global warming remains on track to charge past 1.5 degrees perhaps even by the end of this decade.
Yet there is hope. The rate of growth in CO2 emissions is slowing globally — down from 3 percent annual growth in the 2000s to half a percent annual growth in the last decade, according to the Global Carbon Project, which quantifies greenhouse gas emissions.
There are signs annual emissions could start shrinking. And over the last two years, the United States, by far the biggest cumulative contributor to global warming, has passed several pieces of federal legislation that include financial incentives to accelerate the transition to clean energy. “We’ve never seen anything at this scale,” says Erin Mayfield, an energy researcher at Dartmouth College.
Though the energy transition will require many new technologies, such as innovative ways to permanently remove carbon from the atmosphere, many of the solutions, such as wind and solar power, are in hand — “stuff we already have,” Mayfield says.
The current state of carbon dioxide emissions
Of all the emissions that need to be slashed, the most important is carbon dioxide, which comes from many sources such as cars and trucks and coal-burning power plants. The gas accounted for 79 percent of U.S. greenhouse gas emissions in 2020. The next most significant greenhouse gas, at 11 percent of emissions in the United States, is methane, which comes from oil and gas operations as well as livestock, landfills and other land uses.
The amount of methane may seem small, but it is mighty — over the short term, methane is more than 80 times as efficient at trapping heat as carbon dioxide is, and methane’s atmospheric levels have nearly tripled in the last two centuries. Other greenhouse gases include nitrous oxides, which come from sources such as applying fertilizer to crops or burning fuels and account for 7 percent of U.S. emissions, and human-made fluorinated gases such as hydrofluorocarbons that account for 3 percent.
Globally, emissions are dominated by large nations that produce lots of energy. The United States alone emits around 5 billion metric tons of carbon dioxide each year. It has emitted more greenhouse gases over history than any other nation and ceded the spot for top annual emitter to China only in the mid-2000s. India ranks third.
Because the United States has produced the largest share of carbon pollution to date, many researchers and advocates argue that it has a moral responsibility to take the global lead on cutting emissions. And the United States has the most ambitious goals of the major emitters, at least on paper. President Joe Biden has said the country is aiming to reach net-zero emissions by 2050. Leaders in China and India have set net-zero goals of 2060 and 2070, respectively.
Under the auspices of a 2015 international climate change treaty known as the Paris agreement, 193 nations plus the European Union have pledged to reduce their emissions. The agreement aims to keep global warming well below 2 degrees, and ideally to 1.5 degrees, above preindustrial levels. But it is insufficient. Even if all countries cut their emissions as much as they have promised under the Paris agreement, the world would likely blow past 2 degrees of warming before the end of this century.
Every nation continues to find its own path forward. “At the end of the day, all the solutions are going to be country-specific,” says Sha Yu, an earth scientist at the Pacific Northwest National Laboratory and University of Maryland’s Joint Global Change Research Institute in College Park, Md. “There’s not a universal fix.”
But there are some common themes for how to accomplish this energy transition — ways to focus our efforts on the things that will matter most. These are efforts that go beyond individual consumer choices such as whether to fly less or eat less meat. They instead penetrate every aspect of how society produces and consumes energy.
Such massive changes will need to overcome a lot of resistance, including from companies that make money off old forms of energy as well as politicians and lobbyists. But if society can make these changes, it will rank as one of humanity’s greatest accomplishments. We will have tackled a problem of our own making and conquered it.
Here’s a look at what we’ll need to do.
Make as much clean electricity as possible
To meet the need for energy without putting carbon dioxide into the atmosphere, countries would need to dramatically scale up the amount of clean energy they produce. Fortunately, most of that energy would be generated by technologies we already have — renewable sources of energy including wind and solar power.
“Renewables, far and wide, are the key pillar in any net-zero scenario,” says Mayfield, who worked on an influential 2021 report from Princeton University’s Net-Zero America project, which focused on the U.S. economy.
The Princeton report envisions wind and solar power production roughly quadrupling by 2030 to get the United States to net-zero emissions by 2050. That would mean building many new solar and wind farms, so many that in the most ambitious scenario, wind turbines would cover an area the size of Arkansas, Iowa, Kansas, Missouri, Nebraska and Oklahoma combined. Such a scale-up is only possible because prices to produce renewable energy have plunged. The cost of wind power has dropped nearly 70 percent, and solar power nearly 90 percent, over the last decade in the United States. “That was a game changer that I don’t know if some people were expecting,” Hidalgo-Gonzalez says.
Globally, the price drop in renewables has allowed growth to surge; China, for instance, installed a record 55 gigawatts of solar power capacity in 2021, for a total of 306 gigawatts, or nearly 13 percent of the nation’s installed capacity to generate electricity. China is almost certain to have had another record year for solar power installations in 2022.
Challenges include figuring out ways to store and transmit all that extra electricity, and finding locations to build wind and solar power installations that are acceptable to local communities. Other types of low-carbon power, such as hydropower and nuclear power, which comes with its own public resistance, will also likely play a role going forward.
Get efficient and go electric
The drive toward net-zero emissions also requires boosting energy efficiency across industries and electrifying as many aspects of modern life as possible, such as transportation and home heating.
Some industries are already shifting to more efficient methods of production, such as steelmaking in China that incorporates hydrogen-based furnaces that are much cleaner than coal-fired ones, Yu says. In India, simply closing down the most inefficient coal-burning power plants provides the most bang for the buck, says Shayak Sengupta, an energy and policy expert at the Observer Research Foundation America think tank in Washington, D.C. “The list has been made up,” he says, of the plants that should close first, “and that’s been happening.”
To achieve net-zero, the United States would need to increase its share of electric heat pumps, which heat houses much more cleanly than gas- or oil-fired appliances, from around 10 percent in 2020 to as much as 80 percent by 2050, according to the Princeton report. Federal subsidies for these sorts of appliances are rolling out in 2023 as part of the new Inflation Reduction Act, legislation that contains a number of climate-related provisions.
Shifting cars and other vehicles away from burning gasoline to running on electricity would also lead to significant emissions cuts. In a major 2021 report, the National Academies of Sciences, Engineering and Medicine said that one of the most important moves in decarbonizing the U.S. economy would be having electric vehicles account for half of all new vehicle sales by 2030. That’s not impossible: electric cars accounted for nearly 6 percent of new vehicle sales in the United States in 2022, still a low share but nearly double that of the previous year.
Make clean fuels
Some industries such as manufacturing and transportation can’t be fully electrified using current technologies — battery-powered airplanes, for instance, will probably never be feasible for long-duration flights. Technologies that still require liquid fuels will need to switch from gas, oil and other fossil fuels to low-carbon or zero-carbon fuels.
One major player will be fuels extracted from plants and other biomass, which take up carbon dioxide as they grow and emit it when they die, making them essentially carbon neutral over their lifetime. To create biofuels, farmers grow crops, and others process the harvest in conversion facilities into fuels such as hydrogen. Hydrogen, in turn, can be substituted for more carbon-intensive substances in various industrial processes such as making plastics and fertilizers — and maybe even as fuel for airplanes someday.
In one of the Princeton team’s scenarios, the U.S. Midwest and Southeast would become peppered with biomass conversion plants by 2050, so that fuels can be processed close to where crops are grown. Many of the biomass feedstocks could potentially grow alongside food crops or replace other, nonfood crops.
Cut methane and other non-CO2 emissions
Greenhouse gas emissions other than carbon dioxide will also need to be slashed. In the United States, the majority of methane emissions come from livestock, landfills and other agricultural sources, as well as scattered sources such as forest fires and wetlands. But about one-third of U.S. methane emissions come from oil, gas and coal operations. These may be some of the first places that regulators can target for cleanup, especially “super emitters” that can be pinpointed using satellites and other types of remote sensing.
In 2021, the United States and the European Union unveiled what became a global methane pledge endorsed by 150 countries to reduce emissions. There is, however, no enforcement of it yet. And China, the world’s largest methane emitter, has not signed on.
Nitrous oxides could be reduced by improving soil management techniques, and fluorinated gases by finding alternatives and improving production and recycling efforts.
Sop up as much CO2 as possible
Once emissions have been cut as much as possible, reaching net-zero will mean removing and storing an amount of carbon equivalent to what society still emits.
One solution already in use is to capture carbon dioxide produced at power plants and other industrial facilities and store it permanently somewhere, such as deep underground. Globally there are around 35 such operations, which collectively draw down around 45 million tons of carbon dioxide annually. About 200 new plants are on the drawing board to be operating by the end of this decade, according to the International Energy Agency.
The Princeton report envisions carbon capture being added to almost every kind of U.S. industrial plant, from cement production to biomass conversion. Much of the carbon dioxide would be liquefied and piped along more than 100,000 kilometers of new pipelines to deep geologic storage, primarily along the Texas Gulf Coast, where underground reservoirs can be used to trap it permanently. This would be a massive infrastructure effort. Building this pipeline network could cost up to $230 billion, including $13 billion just for early buy-in from local communities and permitting.
Another way to sop up carbon is to get forests and soils to take up more. That could be accomplished by converting crops that are relatively carbon-intensive, such as corn to be used in ethanol, to energy-rich grasses that can be used for more efficient biofuels, or by turning some cropland or pastures back into forest. It’s even possible to sprinkle crushed rock onto croplands, which accelerates natural weathering processes that suck carbon dioxide out of the atmosphere.
Another way to increase the amount of carbon stored in the land is to reduce the amount of the Amazon rainforest that is cut down each year. “For a few countries like Brazil, preventing deforestation will be the first thing you can do,” Yu says.
When it comes to climate change, there’s no time to waste
The Princeton team estimates that the United States would need to invest at least an additional $2.5 trillion over the next 10 years for the country to have a shot at achieving net-zero emissions by 2050. Congress has begun ramping up funding with two large pieces of federal legislation it passed in 2021 and 2022. Those steer more than $1 trillion toward modernizing major parts of the nation’s economy over a decade — including investing in the energy transition to help fight climate change.
Between now and 2030, solar and wind power, plus increasing energy efficiency, can deliver about half of the emissions reductions needed for this decade, the International Energy Agency estimates. After that, the primary drivers would need to be increasing electrification, carbon capture and storage, and clean fuels such as hydrogen. The trick is to do all of this without making people’s lives worse. Developing nations need to be able to supply energy for their economies to develop. Communities whose jobs relied on fossil fuels need to have new economic opportunities.
Julia Haggerty, a geographer at Montana State University in Bozeman who studies communities that are dependent on natural resources, says that those who have money and other resources to support the transition will weather the change better than those who are under-resourced now. “At the landscape of states and regions, it just remains incredibly uneven,” she says.
The ongoing energy transition also faces unanticipated shocks such as Russia’s invasion of Ukraine, which sent energy prices soaring in Europe, and the COVID-19 pandemic, which initially slashed global emissions but later saw them rebound.
But the technologies exist for us to wean our lives off fossil fuels. And we have the inventiveness to develop more as needed. Transforming how we produce and use energy, as rapidly as possible, is a tremendous challenge — but one that we can meet head-on. For Mayfield, getting to net-zero by 2050 is a realistic goal for the United States. “I think it’s possible,” she says. “But it doesn’t mean there’s not a lot more work to be done.”
As people around the world marveled in July at the most detailed pictures of the cosmos snapped by the James Webb Space Telescope, biologists got their first glimpses of a different set of images — ones that could help revolutionize life sciences research.
The images are the predicted 3-D shapes of more than 200 million proteins, rendered by an artificial intelligence system called AlphaFold. “You can think of it as covering the entire protein universe,” said Demis Hassabis at a July 26 news briefing. Hassabis is cofounder and CEO of DeepMind, the London-based company that created the system. Combining several deep-learning techniques, the computer program is trained to predict protein shapes by recognizing patterns in structures that have already been solved through decades of experimental work using electron microscopes and other methods. The AI’s first splash came in 2021, with predictions for 350,000 protein structures — including almost all known human proteins. DeepMind partnered with the European Bioinformatics Institute of the European Molecular Biology Laboratory to make the structures available in a public database.
July’s massive new release expanded the library to “almost every organism on the planet that has had its genome sequenced,” Hassabis said. “You can look up a 3-D structure of a protein almost as easily as doing a keyword Google search.”
These are predictions, not actual structures. Yet researchers have used some of the 2021 predictions to develop potential new malaria vaccines, improve understanding of Parkinson’s disease, work out how to protect honeybee health, gain insight into human evolution and more. DeepMind has also focused AlphaFold on neglected tropical diseases, including Chagas disease and leishmaniasis, which can be debilitating or lethal if left untreated. The release of the vast dataset was greeted with excitement by many scientists. But others worry that researchers will take the predicted structures as the true shapes of proteins. There are still things AlphaFold can’t do — and wasn’t designed to do — that need to be tackled before the protein cosmos completely comes into focus.
Having the new catalog open to everyone is “a huge benefit,” says Julie Forman-Kay, a protein biophysicist at the Hospital for Sick Children and the University of Toronto. In many cases, AlphaFold and RoseTTAFold, another AI researchers are excited about, predict shapes that match up well with protein profiles from experiments. But, she cautions, “it’s not that way across the board.”
Predictions are more accurate for some proteins than for others. Erroneous predictions could leave some scientists thinking they understand how a protein works when really, they don’t. Painstaking experiments remain crucial to understanding how proteins fold, Forman-Kay says. “There’s this sense now that people don’t have to do experimental structure determination, which is not true.”
Plodding progress
Proteins start out as long chains of amino acids and fold into a host of curlicues and other 3-D shapes. Some resemble the tight corkscrew ringlets of a 1980s perm or the pleats of an accordion. Others could be mistaken for a child’s spiraling scribbles.
A protein’s architecture is more than just aesthetics; it can determine how that protein functions. For instance, proteins called enzymes need a pocket where they can capture small molecules and carry out chemical reactions. And proteins that work in a protein complex, two or more proteins interacting like parts of a machine, need the right shapes to snap into formation with their partners.
Knowing the folds, coils and loops of a protein’s shape may help scientists decipher how, for example, a mutation alters that shape to cause disease. That knowledge could also help researchers make better vaccines and drugs.
For years, scientists have bombarded protein crystals with X-rays, flash-frozen cells and examined them under high-powered electron microscopes, and used other methods to discover the secrets of protein shapes. Such experimental methods take “a lot of personnel time, a lot of effort and a lot of money. So it’s been slow,” says Tamir Gonen, a membrane biophysicist and Howard Hughes Medical Institute investigator at the David Geffen School of Medicine at UCLA.
Such meticulous and expensive experimental work has uncovered the 3-D structures of more than 194,000 proteins, their data files stored in the Protein Data Bank, supported by a consortium of research organizations. But the accelerating pace at which geneticists are deciphering the DNA instructions for making proteins has far outstripped structural biologists’ ability to keep up, says systems biologist Nazim Bouatta of Harvard Medical School. “The question for structural biologists was, how do we close the gap?” he says.
For many researchers, the dream has been to have computer programs that could examine the DNA of a gene and predict how the protein it encodes would fold into a 3-D shape.
Here comes AlphaFold

Over many decades, scientists made progress toward that goal. But “until two years ago, we were really a long way from anything like a good solution,” says John Moult, a computational biologist at the University of Maryland’s Rockville campus.
Moult is one of the organizers of a competition: the Critical Assessment of protein Structure Prediction, or CASP. Organizers give competitors a set of proteins for their algorithms to fold and compare the machines’ predictions against experimentally determined structures. Most AIs failed to get close to the actual shapes of the proteins. Then in 2020, AlphaFold showed up in a big way, predicting the structures of 90 percent of test proteins with high accuracy, including two-thirds with accuracy rivaling experimental methods.
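CASP scores entries with specialized measures, but the underlying idea is a numerical comparison between predicted and experimentally determined atomic coordinates. Here is a minimal sketch using root-mean-square deviation (RMSD), a standard structure-comparison metric; the coordinates are invented for illustration, and real CASP scoring uses more robust measures than plain RMSD:

```python
import math

def rmsd(predicted, experimental):
    """Root-mean-square deviation between two matched lists of (x, y, z)
    atomic coordinates in the same units (e.g., angstroms). Assumes the
    two structures are already superimposed; real pipelines align first."""
    assert len(predicted) == len(experimental)
    total = sum(
        (px - ex) ** 2 + (py - ey) ** 2 + (pz - ez) ** 2
        for (px, py, pz), (ex, ey, ez) in zip(predicted, experimental)
    )
    return math.sqrt(total / len(predicted))

# Hypothetical coordinates: a perfect prediction scores 0.0, and the
# score grows as predicted atoms drift from their experimental positions.
experimental = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
print(rmsd(experimental, experimental))  # 0.0
```

One reason CASP favors other measures is that a single badly placed loop can inflate RMSD even when most of a prediction is excellent.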
Deciphering the structure of single proteins had been the core of the CASP competition since its inception in 1994. With AlphaFold’s performance, “suddenly, that was essentially done,” Moult says.
Since AlphaFold’s 2021 release, more than half a million scientists have accessed its database, Hassabis said in the news briefing. Some researchers, for example, have used AlphaFold’s predictions to help them get closer to completing a massive biological puzzle: the nuclear pore complex. Nuclear pores are key portals that allow molecules in and out of cell nuclei. Without the pores, cells wouldn’t work properly. Each pore is huge, relatively speaking, composed of about 1,000 pieces of 30 or so different proteins. Researchers had previously managed to place about 30 percent of the pieces in the puzzle. That puzzle is now almost 60 percent complete, after combining AlphaFold predictions with experimental techniques to understand how the pieces fit together, researchers reported in the June 10 Science.
Now that AlphaFold has pretty much solved how to fold single proteins, this year CASP organizers are asking teams to work on the next challenges: Predict the structures of RNA molecules and model how proteins interact with each other and with other molecules.
For those sorts of tasks, Moult says, deep-learning AI methods “look promising but have not yet delivered the goods.”
Where AI falls short

Being able to model protein interactions would be a big advantage because most proteins don’t operate in isolation. They work with other proteins or other molecules in cells. But AlphaFold’s accuracy at predicting how the shapes of two proteins might change when the proteins interact is “nowhere near” that of its spot-on projections for a slew of single proteins, says Forman-Kay, the University of Toronto protein biophysicist. That’s something AlphaFold’s creators acknowledge too.
The AI was trained to fold proteins by examining the contours of known structures. And many fewer multiprotein complexes than single proteins have been solved experimentally. Forman-Kay studies proteins that refuse to be confined to any particular shape. These intrinsically disordered proteins are typically as floppy as wet noodles (SN: 2/9/13, p. 26). Some will fold into defined forms when they interact with other proteins or molecules. And they can fold into new shapes when paired with different proteins or molecules to do various jobs.
AlphaFold’s predicted shapes reach a high confidence level for about 60 percent of wiggly proteins that Forman-Kay and colleagues examined, the team reported in a preliminary study posted in February at bioRxiv.org. Often the program depicts the shapeshifters as long corkscrews called alpha helices.
Forman-Kay’s group compared AlphaFold’s predictions for three disordered proteins with experimental data. The structure that the AI assigned to a protein called alpha-synuclein resembles the shape that the protein takes when it interacts with lipids, the team found. But that’s not the way the protein looks all the time.
For another protein, called eukaryotic translation initiation factor 4E-binding protein 2, AlphaFold predicted a mishmash of the protein’s two shapes when working with two different partners. That Frankenstein structure, which doesn’t exist in actual organisms, could mislead researchers about how the protein works, Forman-Kay and colleagues say. AlphaFold may also be a little too rigid in its predictions. A static “structure doesn’t tell you everything about how a protein works,” says Jane Dyson, a structural biologist at the Scripps Research Institute in La Jolla, Calif. Even single proteins with generally well-defined structures aren’t frozen in space. Enzymes, for example, undergo small shape changes when shepherding chemical reactions.
If you ask AlphaFold to predict the structure of an enzyme, it will show a fixed image that may closely resemble what scientists have determined by X-ray crystallography, Dyson says. “But [it will] not show you any of the subtleties that are changing as the different partners” interact with the enzyme.
“The dynamics are what Mr. AlphaFold can’t give you,” Dyson says.
A revolution in the making

The computer renderings do give biologists a head start on solving problems such as how a drug might interact with a protein. But scientists should remember one thing: “These are models,” not experimentally deciphered structures, says Gonen, at UCLA.
He uses AlphaFold’s protein predictions to help make sense of experimental data, but he worries that researchers will accept the AI’s predictions as gospel. If that happens, “the risk is that it will become harder and harder and harder to justify why you need to solve an experimental structure.” That could lead to reduced funding, talent and other resources for the types of experiments needed to check the computer’s work and forge new ground, he says. Harvard Medical School’s Bouatta is more optimistic. He thinks that researchers probably don’t need to invest experimental resources in the types of proteins that AlphaFold does a good job of predicting, which should help structural biologists triage where to put their time and money.
“There are proteins for which AlphaFold is still struggling,” Bouatta agrees. Researchers should spend their capital there, he says. “Maybe if we generate more [experimental] data for those challenging proteins, we could use them for retraining another AI system” that could make even better predictions.
He and colleagues have already reverse engineered AlphaFold to make a version called OpenFold that researchers can train to solve other problems, such as those gnarly but important protein complexes.
Massive amounts of DNA generated by the Human Genome Project have made a wide range of biological discoveries possible and opened up new fields of research (SN: 2/12/22, p. 22). Having structural information on 200 million proteins could be similarly revolutionary, Bouatta says.
In the future, thanks to AlphaFold and its AI kin, he says, “we don’t even know what sorts of questions we might be asking.”
On a Hawaiian mountaintop in the summer of 1992, a pair of scientists spotted a pinprick of light inching through the constellation Pisces. That unassuming object — located over a billion kilometers beyond Neptune — would rewrite our understanding of the solar system.
Rather than an expanse of emptiness, there was something, a vast collection of things in fact, lurking beyond the orbits of the known planets.
The scientists had discovered the Kuiper Belt, a doughnut-shaped swath of frozen objects left over from the formation of the solar system.
As researchers learn more about the Kuiper Belt, the origin and evolution of our solar system is coming into clearer focus. Closeup glimpses of the Kuiper Belt’s frozen worlds have shed light on how planets, including our own, might have formed in the first place. And surveys of this region, which have collectively revealed thousands of such bodies, called Kuiper Belt objects, suggest that the early solar system was home to pinballing planets.
The humble object that kick-started it all is a chunk of ice and rock roughly 250 kilometers in diameter. It was first spotted 30 years ago this month.

Staring into space

In the late 1980s, planetary scientist David Jewitt and astronomer Jane Luu, both at MIT at the time, were several years into a curious quest. The duo had been using telescopes in Arizona to take images of patches of the night sky with no particular target in mind. “We were literally just staring off into space looking for something,” says Jewitt, now at UCLA.
An apparent mystery motivated the researchers: The inner solar system is relatively crowded with rocky planets, asteroids and comets, but there was seemingly not much out beyond the gas giant planets, besides small, icy Pluto. “Maybe there were things in the outer solar system,” says Luu, who now works at the University of Oslo and Boston University. “It seemed like a worthwhile thing to check out.” Poring over glass photographic plates and digital images of the night sky, Jewitt and Luu looked for objects that moved extremely slowly, a telltale sign of their great distance from Earth. But the pair kept coming up empty. “Years went by, and we didn’t see anything,” Luu says. “There was no guarantee this was going to work out.”
The tide changed in 1992. On the night of August 30, Jewitt and Luu were using a University of Hawaii telescope on the Big Island. They were employing their usual technique for searching for distant objects: Take an image of the night sky, wait an hour or so, take another image of the same patch of sky, and repeat. An object in the outer reaches of the solar system would shift position ever so slightly from one image to the next, primarily because of the movement of Earth in its orbit. “If it’s a real object, it would move systematically at some predicted rate,” Luu says.
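That predicted rate follows from small-angle geometry: a distant body that is nearly at rest appears to drift at roughly Earth’s orbital speed divided by its distance. A rough sketch of the arithmetic (the 40 AU distance is an assumed example, and the object’s own orbital motion, which slows the apparent drift somewhat, is ignored here):

```python
import math

AU_KM = 1.496e8          # kilometers per astronomical unit
EARTH_SPEED = 29.8       # Earth's mean orbital speed, km/s
RAD_TO_ARCSEC = 206265   # arcseconds per radian

def apparent_drift_arcsec_per_hour(distance_au):
    """Approximate parallax-driven drift of a distant, slow-moving body,
    ignoring the body's own orbital motion (an upper-bound sketch)."""
    distance_km = distance_au * AU_KM
    rate_rad_per_s = EARTH_SPEED / distance_km  # small-angle approximation
    return rate_rad_per_s * 3600 * RAD_TO_ARCSEC

# At ~40 AU, roughly where 1992 QB1 orbits, the drift works out to a few
# arcseconds per hour -- tiny, but measurable between successive exposures.
print(round(apparent_drift_arcsec_per_hour(40), 1))  # 3.7
```

Cosmic rays and asteroids fail this test: a cosmic-ray hit appears in only one frame, and a main-belt asteroid, being far closer, drifts tens of times faster.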
By 9:14 p.m. that evening, Jewitt and Luu had collected two images of the same bit of the constellation Pisces. The researchers displayed the images on the bulbous cathode-ray tube monitor of their computer, one after the other, and looked for anything that had moved. One object immediately stood out: A speck of light had shifted just a touch to the west.
But it was too early to celebrate. Spurious signals from high-energy particles zipping through space — cosmic rays — appear in images of the night sky all of the time. The real test would be whether this speck showed up in more than two images, the researchers knew.
Jewitt and Luu nervously waited until 11 p.m. for the telescope’s camera to finish taking a third image. The same object was there, and it had moved a bit farther west. A fourth image, collected just after midnight, revealed the object had shifted position yet again. This is something real, Jewitt remembers thinking. “We were just blown away.” Based on the object’s brightness and its leisurely pace — it would take nearly a month for it to march across the width of the full moon as seen from Earth — Jewitt and Luu did some quick calculations. This thing, whatever it was, was probably about 250 kilometers in diameter. That’s sizable, about one-tenth the width of Pluto. It was orbiting far beyond Neptune. And in all likelihood, it wasn’t alone.
Although Jewitt and Luu had been diligently combing the night sky for years, they had observed only a tiny fraction of it. There were possibly thousands more objects out there like this one just waiting to be found, the two concluded.
The realization that the outer solar system was probably teeming with undiscovered bodies was mind-blowing, Jewitt says. “We expanded the known volume of the solar system enormously.” The object that Jewitt and Luu had found, 1992 QB1 (SN: 9/26/92, p. 196), introduced a whole new realm.
Just a few months later, Jewitt and Luu spotted a second object also orbiting far beyond Neptune (SN: 4/10/93, p. 231). The floodgates opened soon after. “We found 40 or 50 in the next few years,” Jewitt says. As the digital detectors that astronomers used to capture images grew in size and sensitivity, researchers began uncovering droves of additional objects. “So many interesting worlds with interesting stories,” says Mike Brown, an astronomer at Caltech who studies Kuiper Belt objects.
Finding all of these frozen worlds, some orbiting even beyond Pluto, made sense in some ways, Jewitt and Luu realized. Pluto had always been an oddball; it’s a cosmic runt (smaller than Earth’s moon) and looks nothing like its gas giant neighbors. What’s more, its orbit takes it sweeping far above and below the orbits of the other planets. Maybe Pluto belonged not to the world of the planets but to the realm of whatever lay beyond, Jewitt and Luu hypothesized. “We suddenly understood why Pluto was such a weird planet,” Jewitt says. “It’s just one object, maybe the biggest, in a set of bodies that we just stumbled across.” Pluto probably wouldn’t be a member of the planet club much longer, the two predicted. Indeed, by 2006, it was out (SN: 9/2/06, p. 149).
Up-close look

The discovery of 1992 QB1 opened the world’s eyes to the Kuiper Belt, named after Dutch-American astronomer Gerard Kuiper. In a twist of history, however, Kuiper predicted that this region of space would be empty. In the 1950s, he proposed that any occupants that might have once existed there would have been banished by gravity to even more distant reaches of the solar system.
In other words, Kuiper anti-predicted the existence of the Kuiper Belt. He turned out to be wrong.
Today, researchers know that the Kuiper Belt stretches from a distance of roughly 30 astronomical units from the sun — around the orbit of Neptune — to roughly 55 astronomical units. It resembles a puffed-up disk, Jewitt says. “Superficially, it looks like a fat doughnut.”
The frozen bodies that populate the Kuiper Belt are the remnants of the swirling maelstrom of gas and dust that birthed the sun and the planets. There’s “a bunch of stuff that’s left over that didn’t quite get built up into planets,” says astronomer Meredith MacGregor of the University of Colorado Boulder. When one of those cosmic leftovers gets kicked into the inner solar system by a gravitational shove from a planet like Neptune and approaches the sun, it turns into an object we recognize as a comet (SN: 9/12/20, p. 14). Comets that circle the sun only once every 200 years or more typically derive from the solar system’s even more distant repository of icy bodies known as the Oort cloud. In scientific parlance, the Kuiper Belt is a debris disk (SN Online: 7/28/21). Distant solar systems contain debris disks, too, scientists have discovered. “They’re absolutely directly analogous to our Kuiper Belt,” MacGregor says.
In 2015, scientists got their first close look at a Kuiper Belt object when NASA’s New Horizons spacecraft flew by Pluto (SN Online: 7/15/15). The pictures that New Horizons returned in the following years were thousands of times more detailed than previous observations of Pluto and its moons. No longer just a few fuzzy pixels, the worlds were revealed as rich landscapes of ice-spewing volcanoes and deep, jagged canyons (SN: 6/22/19, p. 12; SN Online: 7/13/18). “I’m just absolutely ecstatic with what we accomplished at Pluto,” says Marc Buie, an astronomer at the Southwest Research Institute in Boulder, Colo., and a member of the New Horizons team. “It could not possibly have gone any better.”
But New Horizons wasn’t finished with the Kuiper Belt. On New Year’s Day of 2019, when the spacecraft was almost 1.5 billion kilometers beyond Pluto’s orbit, it flew past another Kuiper Belt object. And what a surprise it was. Arrokoth — its name refers to “sky” in the Powhatan/Algonquian language — looks like a pair of pancakes joined at the hip (SN: 12/21/19 & 1/4/20, p. 5; SN: 3/16/19, p. 15). Roughly 35 kilometers long from end to end, it was probably once two separate bodies that gently collided and stuck. Arrokoth’s bizarre structure sheds light on a fundamental question in astronomy: How do gas and dust clump together and grow into larger bodies?
One long-standing theory, called planetesimal accretion, says that a series of collisions is responsible. Tiny bits of material collide and stick together on repeat to build up larger and larger objects, says JJ Kavelaars, an astronomer at the University of Victoria and the National Research Council of Canada. But there’s a problem, Kavelaars says. As objects get large enough to exert a significant gravitational pull, they accelerate as they approach one another. “They hit each other too fast, and they don’t stick together,” he says. It would be unusual for a large object like Arrokoth, particularly with its two-lobed structure, to have formed from a sequence of collisions.
More likely, Arrokoth was born from a process known as gravitational instability, researchers now believe. In that scenario, a clump of material that happens to be denser than its surroundings grows by pulling in gas and dust. This process can form planets on timescales of thousands of years, rather than the millions of years required for planetesimal accretion. “The timescale for planet formation completely changes,” Kavelaars says.
If Arrokoth formed this way, other bodies in the solar system probably did too. That may mean that parts of the solar system formed much more rapidly than previously believed, says Buie, who discovered Arrokoth in 2014. “Already Arrokoth has rewritten the textbooks on how solar system formation works.”
What they’ve seen so far makes scientists even more eager to study another Kuiper Belt object up close. New Horizons is still making its way through the Kuiper Belt, but time is running out to identify a new object and orchestrate a rendezvous. The spacecraft, which is currently 53 astronomical units from the sun, is approaching the Kuiper Belt’s outer edge. Several teams of astronomers are using telescopes around the world to search for new Kuiper Belt objects that would make a close pass to New Horizons. “We are definitely looking,” Buie says. “We would like nothing better than to fly by another object.”
All eyes on the Kuiper Belt

Astronomers are also getting a wide-angle view of the Kuiper Belt by surveying it with some of Earth’s largest telescopes. At the Canada-France-Hawaii Telescope on Mauna Kea — the same mountaintop where Jewitt and Luu spotted 1992 QB1 — astronomers recently wrapped up the Outer Solar System Origins Survey. It recorded more than 800 previously unknown Kuiper Belt objects, bringing the total number known to roughly 3,000. This cataloging work is revealing tantalizing patterns in how these bodies move around the sun, MacGregor says. Rather than being uniformly distributed, the orbits of Kuiper Belt objects tend to be clustered in space. That’s a telltale sign that these bodies got a gravitational shove in the past, she says.
The cosmic bullies that did that shoving, most astronomers believe, were none other than the solar system’s gas giants. In the mid-2000s, scientists first proposed that planets like Neptune and Saturn probably pinballed toward and away from the sun early in the solar system’s history (SN: 5/5/12, p. 24). That movement explains the strikingly similar orbits of many Kuiper Belt objects, MacGregor says. “The giant planets stirred up all of the stuff in the outer part of the solar system.”
Refining the solar system’s early history requires observations of even more Kuiper Belt objects, says Meg Schwamb, an astronomer at Queen’s University Belfast in Northern Ireland. Researchers expect that a new astronomical survey, slated to begin next year, will find roughly 40,000 more Kuiper Belt objects. The Vera C. Rubin Observatory, being built in north-central Chile, will use its 3,200-megapixel camera to repeatedly photograph the entire Southern Hemisphere sky every few nights for 10 years. That undertaking, the Legacy Survey of Space and Time, or LSST, will revolutionize our understanding of how the early solar system evolved, says Schwamb, a cochair of the LSST Solar System Science Collaboration. It’s exciting to think about what we might learn next from the Kuiper Belt, Jewitt says. The discoveries that lay ahead will be possible, in large part, because of advances in technology, he says. “One picture with one of the modern survey cameras is roughly a thousand pictures with our setup back in 1992.”
But even as we uncover more about this distant realm of the solar system, a bit of awe should always remain, Jewitt says. “It’s the largest piece of the solar system that we’ve yet observed.”
Even spiders, it seems, have fallen victim to misinformation.
Media reports about people’s encounters with spiders tend to be full of falsehoods with a distinctly negative spin. An analysis of a decade’s worth of newspaper stories from dozens of countries finds that nearly half of the reports contain errors, arachnologist Catherine Scott and colleagues report August 22 in Current Biology.
“The vast majority of the spider content out there is about them being scary and hurting people,” says Scott, of McGill University in Montreal. In reality, they note, “spiders almost never bite people.” Of the roughly 50,000 known spider species, vanishingly few are dangerous. Instead, many spiders benefit us by eating insects like mosquitoes that are harmful to people. Even with the rare exceptions like brown recluse and black widow spiders, bites are extremely uncommon, Scott says. Some stories about bites blamed spiders that don’t occur in the area, and others reported symptoms that don’t match symptoms of actual bites. “So many stories about spider bites included no evidence whatsoever that there was any spider involved,” they say.
To conduct the study, Scott and their colleagues analyzed over 5,000 online newspaper stories about humans and spiders from 2010 to 2020 across 81 countries. In addition to errors, the team determined that 43 percent of the stories were sensationalized, often using words like nasty, killer, agony and nightmare. International and national newspapers were more likely to sensationalize spiders than regional outlets. Stories that included a spider expert were less sensationalistic, though there was no such effect from other experts, including doctors.
If people knew the truth about spiders, they could spend less time blaming them for bites and killing them with pesticides that are toxic to many other species, including humans, Scott says. Clearing up the misinformation would be good for spiders, too — especially the one in your house that doesn’t get squashed out of fear. Spiders in general stand to benefit, the researchers conclude, because news helps shape public opinion, which can influence decisions about wildlife conservation.
“Spiders are kind of unique in that they seem to be really good at capturing people’s attention,” says arachnologist Lisa Taylor at the University of Florida in Gainesville, who was not involved in the study. “If that attention is paired with real information about how fascinating they are, rather than sensationalistic misinformation, then I think spiders are well-suited to serve as tiny ambassadors for wildlife in general.”
Sea urchin skeletons may owe some of their strength to a common geometric design.
Components of the skeletons of common sea urchins (Paracentrotus lividus) follow a similar pattern to that found in honeycombs and dragonfly wings, researchers report in the August Journal of the Royal Society Interface. Studying this recurring natural order could inspire the creation of strong yet lightweight new materials.
Urchin skeletons display “an incredible diversity of structures at the microscale, varying from fully ordered to entirely chaotic,” says marine biologist and biomimetic consultant Valentina Perricone. These structures may help the animals maintain their shape when faced with predator attacks and environmental stresses.
While using a scanning electron microscope to study urchin skeleton tubercules — sites where the spines attach that withstand strong mechanical forces — Perricone spotted “a curious regularity.” Tubercules seem to follow a type of common natural order called a Voronoi pattern, she and her colleagues found. Mathematically, a Voronoi pattern divides a region into polygon-shaped cells, each built around a point within it called a seed (SN: 9/23/18). The cells follow the nearest neighbor rule: Every spot inside a cell is nearer to that cell’s seed than to any other seed. Also, the boundary that separates two cells is equidistant from both their seeds.
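The nearest neighbor rule can be made concrete with a short sketch that assigns points to Voronoi cells by finding the closest seed; the seed coordinates here are arbitrary assumptions for illustration:

```python
import math

def nearest_seed(point, seeds):
    """Return the index of the seed closest to `point`. By the nearest
    neighbor rule, `point` lies inside that seed's Voronoi cell."""
    return min(range(len(seeds)),
               key=lambda i: math.dist(point, seeds[i]))

# Three arbitrary seeds, each defining one Voronoi cell.
seeds = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]

# The point (3.5, 0.5) is closest to seed 1, so it falls in cell 1.
print(nearest_seed((3.5, 0.5), seeds))  # 1
```

A point such as (2.0, 0.0), equally far from the first two seeds, sits exactly on the boundary between their cells, matching the equidistance property described above.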
A computer-generated Voronoi pattern had an 82 percent match with the pattern found in sea urchin skeletons. This arrangement, the team suspects, yields a strong yet lightweight skeletal structure. The pattern “can be interpreted as an evolutionary solution” that “optimizes the skeleton,” says Perricone, of the University of Campania “Luigi Vanvitelli” in Aversa, Italy.
Urchins, dragonflies and bees aren’t the only beneficiaries of Voronoi architecture. “We are developing a library of bioinspired, Voronoi-based structures” that could “serve as lightweight and resistant solutions” for materials design, Perricone says. These, she hopes, could inspire new developments in materials science, aerospace, architecture and construction.
When Herminia Pasantes Ordóñez was about 14 years old, in 1950, she heard her mother tell her father that she would never find a husband. Pasantes had to wear thick glasses for her poor eyesight. In her mother’s eyes, those glasses meant her future as a “good woman” was doomed. “This made my life easier,” says Pasantes, “because it was already said that I was going to study.”
At a time when it was uncommon for women to become scientists, Pasantes studied biology at the National Autonomous University of Mexico in Mexico City, or UNAM. She was the first member of her family to go to college. She became a neurobiologist and one of the most important Mexican scientists of her time. Her studies on the role of the chemical taurine in the brain offer deep insights into how cells maintain their size — essential to proper functioning. In 2001, she became the first woman to earn Mexico’s National Prize for Sciences and Arts in the area of physical, mathematical and natural sciences.
“We basically learned about cell volume regulation through the eyes and work of Herminia,” says Alexander Mongin, a Belarusian neuroscientist at Albany Medical College in New York.
Pasantes did get married, in 1965 while doing her master’s in biochemistry at UNAM. She had a daughter in 1966 and a son in 1967 before starting a Ph.D. in natural sciences in 1970 at the Center for Neurochemistry at the University of Strasbourg in France. There, she worked in the laboratory of Paul Mandel, a Polish pioneer in neurochemistry.
The lab was trying to find out everything there was to know about the retina, the layer of tissue at the back of the eye that is sensitive to light. Pasantes decided to test whether free amino acids, those not incorporated into proteins, were present in the retinas and brains of mice. Her first chromatography — a lab technique that lets scientists separate and identify the components of a sample — showed an immense amount of taurine in both tissues. Taurine would drive the rest of her scientific career, including work in her own lab, which she started around 1975 at the Institute of Cellular Physiology at UNAM.
Taurine turns out to be widely distributed in animal tissues and has diverse biological functions, some of which were discovered by Pasantes. Her research found that taurine helps maintain cell volume in nerve cells, and that it protects brain, muscle, heart and retinal cells by preventing the death of stem cells, which give rise to all specialized cells in the body. Contrary to what most scientists had believed at the time, taurine didn’t work as a neurotransmitter sending messages between nerve cells. Pasantes demonstrated for the first time that it worked as an osmolyte in the brain. Osmolytes help maintain the size and integrity of cells by opening up channels in their membranes to get water in or out.
Pasantes says she spent many years looking for an answer for why there is so much taurine in the brain. “When you ask nature a question, 80 to 90 percent of the time, it responds no,” she says. “But when it answers yes, it’s wonderful.”
Pasantes’ lab was one of the big four labs that did groundbreaking work on cell volume regulation in the brain, says Mongin.
Her work and that of others proved taurine has a protective effect; it’s the reason the chemical is today sprinkled in the containers that carry organs for transplants. Pasantes’ work was the foundation for our understanding of how to prevent and treat brain edema, a condition where the brain swells due to excessive accumulation of fluid, from head trauma or reduced blood supply, for example. She and other experts also reviewed the role of taurine for Red Bull, which added the chemical to its formula because of potentially protective effects in the heart.
Pasantes stopped doing research in 2019 and spends her time talking and writing about science. She hopes her story speaks to women around the world who wish to be scientists: “It is important to send the message that it is possible,” she says.
Years before she was accepted into Mandel’s lab, her application to a Ph.D. in biochemistry at the UNAM was rejected. Pasantes says the reason was that she had just had her daughter. Looking back, this moment was “one of the most wonderful things that could’ve happened to me,” Pasantes says, because she ended up in Strasbourg, where her potential as a researcher bloomed.
Rosa María González Victoria, a social scientist at the Autonomous University of the State of Hidalgo in Pachuca, Mexico, who specializes in gender studies, recently interviewed Pasantes for a book about Mexican women in science. González Victoria thinks Pasantes’ response to that early rejection speaks to the kind of person she is: “A woman that takes those no’s and turns them into yes’s.”