The Noah's Ark Problem: figuring out which species to conserve with limited resources. JoeLena/Getty Images

The annual United Nations Climate Change Conference, better known as COP, which starts Nov. 30 in the United Arab Emirates, will bring together governments, businesses, international organizations and NGOs to shine a spotlight on the climate emergency the world faces and consider solutions to the crisis. The alarming rate at which we are losing species is not just a tragedy of epic proportions – the destruction of biodiversity also robs humanity of one of its strongest defenses against climate change.

Retaining the earth’s diverse mix of animals and plants is crucial for the planet’s future, yet any plan to halt its loss must grapple with the reality that not every species can be saved from extinction because of the limited resources we have for biodiversity conservation. By one estimate, about US$598 billion to $824 billion is needed annually to reverse the loss of species worldwide.

Different ways of posing the problem

Given finite research and practical resources, how should we act to conserve biological diversity? Should we, as I have argued in my research as an expert in environmental economics, try to regulate the rate at which habitat is being converted from natural to human-centered uses?

An alternative approach concentrates on conserving what biologists call keystone species that play a critical role in holding the ecosystem together. An example is the gray wolf in Yellowstone National Park, whose presence regulates prey populations like elk and deer, which in turn have cascading effects on vegetation and the overall ecosystem structure and function.

The Bible suggests a contrasting approach in the Lord’s dictum to Noah before the great flood: “Of fowls after their kind, and of cattle after their kind, of every creeping thing of the earth after his kind, two of every sort shall come unto thee, to keep them alive.”

A solution

One of the most original and interesting answers to this question was provided by the late Harvard economist Martin Weitzman, who applied economic analysis to address the conservation of endangered species. In a pioneering 1998 paper titled “The Noah’s Ark Problem,” Weitzman viewed the challenge of figuring out which species to conserve with limited resources as a modern-day equivalent of the problem the biblical patriarch Noah faced when trying to determine what to take with him – and hence save – on his ark.

The late economist Martin Weitzman giving a talk.
Martin Weitzman’s research looked at the challenge of figuring out which endangered species to conserve with limited resources. Wikimedia Commons, CC BY-SA

In Weitzman’s view, biodiversity gives rise to two kinds of values. The first is utility to humans – insects pollinate crops that yield food, and so on. There is no serious dispute that biodiversity – the variety of living species on Earth, including plants, animals, bacteria and fungi – benefits humans.

As the World Health Organization puts it, “Healthy communities rely on well-functioning ecosystems. They provide clean air, fresh water, medicines and food security. They also limit disease and stabilize the climate.” Yet nearly a third of all monitored species are currently endangered because of human activities.

The second kind of value Weitzman identified is the inherent worth, to biological diversity itself, of the wide variety of species and the genetic information they contain. Biodiversity plays a crucial role in maintaining the stability and resilience of ecosystems.

For example, increased genetic variation is important to wild Alaskan salmon returning to natal streams and rivers to reproduce. Populations in different streams have developed different sets of genetic information; some of these will allow for the earlier migration that will be needed as temperatures warm and snowmelt comes earlier.

Weitzman likens the task of preserving different species to the task of saving the volumes in a library that represent an accumulation of human knowledge.

While in principle, every volume in the library might be valuable, some may have information that is also available in other libraries. Therefore, the objective would be to save those volumes that have information in them that is not contained anywhere else. According to this view, a conservationist’s goal ought to be to save as much of this genetic information as possible, even if the species concerned provide little direct value to humans.

This line of thinking provides counterintuitive guidance to conservationists. Specifically, it suggests that the best way to conserve biodiversity in an uncertain and resource-constrained world is to pick a species and then save as many members of this species as possible. By following this aggressive or “extreme policy,” the conservationist preserves not only what is informationally distinct about this species but also all the information it shares with other species.

Bumblebees on a yellow flower collect pollen. nnorozoff/Getty Images

An example

To see this, imagine that there are two libraries that have many volumes (or species members), some unique to each library and some overlapping. If Library 1 burns to the ground, we lose all of the volumes (species members) with the exception of those that are also housed in Library 2. The same is true if Library 2 burns.

If both libraries burn, all is lost. If both are on fire and we do not have the equipment to save both, and one library takes fewer resources to save, we may be better off using our scarce resources to protect that one and letting the other go: we keep its unique volumes (species members) as well as the knowledge in the overlapping volumes.
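
To make Weitzman’s logic concrete, here is a minimal Python sketch of the two-library choice. All of the numbers – volume counts, costs and the budget – are hypothetical, chosen only to illustrate the idea of maximizing preserved information per unit of scarce resources.

```python
# A minimal, hypothetical sketch of the two-library example. Unique volumes
# survive only if their own library is saved; shared volumes survive if either
# library holding them is saved.
from itertools import combinations

SHARED_VOLUMES = 80                          # volumes (overlapping information) held by both
libraries = {
    "Library 1": {"unique": 120, "cost": 5},
    "Library 2": {"unique": 60, "cost": 9},
}
budget = 9                                   # not enough to save both libraries

def volumes_preserved(saved):
    unique = sum(libraries[name]["unique"] for name in saved)
    shared = SHARED_VOLUMES if saved else 0  # shared information is counted once
    return unique + shared

# Enumerate every affordable combination and keep the one preserving the most.
feasible = (
    combo
    for r in range(len(libraries) + 1)
    for combo in combinations(libraries, r)
    if sum(libraries[name]["cost"] for name in combo) <= budget
)
best = max(feasible, key=volumes_preserved)
print(best, volumes_preserved(best))         # ('Library 1',) 200
```

In this toy setup, fully saving the cheaper library preserves its unique volumes plus the entire shared collection – the all-or-nothing, “extreme policy” character of Weitzman’s solution.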

What does it mean in practice?

The practical meaning is that – when forced to choose – it may not make much sense to devote limited conservation funds to a highly endangered species, such as the cuddly panda, that is very expensive to protect. We may be better off protecting, for example, the Atlantic menhaden, or pogy, a primary food source for bigger fish and birds along the Eastern Seaboard and a vital connection between the bottom and top of the food chain. A current lawsuit claims it is subject to overfishing in and around the Chesapeake Bay.

Weitzman’s Noah’s Ark model seeks to provide useful guidance in determining how to prioritize our efforts to save endangered species, on the presumption that biodiversity is both valuable to humans and inherently valuable. While we lack the resources to save every at-risk species from extinction, further delay in dealing with the climate emergency and the species loss it accelerates is one thing the world cannot afford.

The Conversation

Amitrajeet A. Batabyal does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Read more …Resources to save 'every creeping thing of the earth' are limited. What would Noah do?

Acapulco's beachfront condo towers were devastated by Hurricane Otis. Rodrigo Oropeza/AFP via Getty Images

Acapulco wasn’t prepared when Hurricane Otis struck as a powerful Category 5 storm on Oct. 25, 2023. The short notice as the storm rapidly intensified over the Pacific Ocean wasn’t the only problem – the Mexican resort city’s buildings weren’t designed to handle anything close to Otis’ 165 mph winds.

While Acapulco’s oceanfront high-rises were built to withstand the region’s powerful earthquakes, they had a weakness.

Since powerful hurricanes are rare in Acapulco, Mexico’s building codes didn’t require that their exterior materials be able to hold up to extreme winds. In fact, those materials were often kept light to help meet earthquake building standards.

Otis’ powerful winds ripped off exterior cladding and shattered windows, exposing bedrooms and offices to the wind and rain. The storm took dozens of lives and caused billions of dollars in damage.

A large glass tower with sloping sides, like a sliced egg, reflects the sunrise with the Pacific Ocean looking placid in the background.
A US$130 million luxury condo building on the beach in Acapulco before Hurricane Otis struck on Oct. 25, 2023. Hamid Arabzadeh, PhD., P.Eng.
A stormy sky shows through the floors that were once apartments.
The same Acapulco condo tower after Hurricane Otis. Hamid Arabzadeh, PhD., P.Eng.

I have worked on engineering strategies to enhance disaster resilience for over three decades and recently wrote a book, “The Blessings of Disaster,” about the gambles humans take with disaster risk and how to increase resilience. Otis provided a powerful example of one such gamble that exists when building codes rely on probabilities that certain hazards will occur based on recorded history, rather than considering the severe consequences of storms that can devastate entire cities.

The fatal flaw in building codes

Building codes typically provide “probabilistic-based” maps that specify wind speeds that engineers must consider when designing buildings.

The problem with that approach lies in the fact that “probabilities” are simply the odds that extreme events of a certain size will occur in the future, mostly calculated based on past occurrences. Some models may include additional considerations, but these are still typically anchored in known experience.

This is all good science. Nobody argues with that. It allows engineers to design structures in accordance with a consensus on acceptable return periods for various hazards, that is, on how often events of a given severity are expected to occur. Return periods represent a somewhat arbitrary assessment of the reasonable balance between minimizing risk and keeping building costs down.

However, probabilistic maps only capture the odds of the hazard occurring. A probabilistic map might specify a wind speed to consider for design, irrespective of whether that given location is a small town with a few hotels or a megapolis with high-rises and complex urban infrastructure. In other words, probabilistic maps do not consider the consequences when an extreme hazard exceeds the specified value and “all hell breaks loose.”

How probability left Acapulco exposed

According to the Mexican building code, hotels, condos and other commercial and office buildings in Acapulco must be designed to resist 88 mph winds, corresponding to the strongest wind likely to occur on average once every 50 years there. That’s a Category 1 storm.

A 200-year return period for wind is used for essential facilities, such as hospital and school buildings, corresponding to 118 mph winds. But over a building’s life span of, say, 50 years, that still leaves a 22% chance that winds exceeding 118 mph will occur (yes, the world of statistics is that sneaky).
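
Those figures follow from a standard simplifying assumption: each year is independent, with an exceedance probability of one divided by the return period. The short calculation below, a sketch rather than anything drawn from an actual building code, reproduces them.

```python
# Chance that the design wind is exceeded at least once during a building's life,
# assuming independent years with annual probability 1 / return period.
def prob_exceedance(return_period_years, lifespan_years):
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** lifespan_years

print(f"{prob_exceedance(200, 50):.0%}")  # ~22%: the 200-year wind over a 50-year life span
print(f"{prob_exceedance(50, 50):.0%}")   # ~64%: the 50-year wind over a 50-year life span
```

The second line foreshadows the problem, discussed below, with designing ordinary Acapulco buildings for only a 50-year wind.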

A map of the Mexico area with lots of storm tracks offshore and a few crossing land in the southern part of the country.
Mexico’s hurricane history in storm tracks. NOAA
A map of the Acapulco area with lots of storm tracks offshore and a few crossing land.
A century of hurricane storm tracks near Acapulco show several offshore storms that brought strong winds and rain to the city, but few direct landfalls. Acapulco Bay is in the center of the map on the coast. Red, pink and purple lines are categories 3, 4 and 5, respectively. NOAA

The probability wind maps for both return periods show Acapulco experiences lower average wind speeds than much of the 400 miles of Mexican coast north of the city. Yet, Acapulco is a major city, with a metropolitan population of over 1 million. It also has more than 50 buildings taller than 20 stories, according to the SkyscraperPage, a database of skyscrapers, and it is the only city with buildings that tall along that stretch of the Pacific coast.

Designing for a 50-year return period in this case is questionable, as it implies roughly a 64% chance of encountering wind exceeding this design value over a 50-year building life span, and an even higher chance for buildings that last longer.

Florida faces similar challenges

The shortcomings of probabilistic-based maps that specify wind speeds have also been observed in the United States. For example, new buildings along most of Florida’s coast must be able to resist 140 mph winds or greater, but there are a few exceptions. One is the Big Bend area where Hurricane Idalia made landfall in 2023. Its design wind speed is about 120 mph instead.

A 2023 update to the Florida Building Code raised the minimum wind speed to approximately 140 mph in Mexico Beach, the Panhandle town that was devastated by Hurricane Michael in 2018. The Big Bend exception may be the next one to be eliminated.

Acapulco’s earthquake design weakness

A saving grace for Acapulco is that it is located in one of Mexico’s most active seismic risk zones – for example, a magnitude 7 earthquake struck nearby in 2021. As a result, the lateral-load-resisting structural systems in tall buildings there are designed to resist seismic forces that are generally larger than hurricane forces.

However, a drawback is that the larger the mass of a building, the larger the seismic forces the building must be designed to resist. Consequently, light materials were typically used for the cladding – the exterior surface of the building that protects it against the weather – because that translates into lower seismic forces. This light cladding was not able to withstand hurricane-force winds.

Had the cladding not failed, the full wind forces would have been transferred to the structural system, and the buildings would have survived with little or no damage.

A ‘good engineering approach’ to hazards

A better building code could go one step beyond “good science” probabilistic maps and adopt a “good engineering approach” by taking stock of the consequences of extreme events occurring, not just the odds that they will.

In Florida, the incremental cost of designing for wind speeds of 140 mph rather than 120 mph is marginal compared to total building cost, given that cladding able to resist more than 140 mph is already used in nearly all of the state. In Acapulco, with the spine of buildings already able to resist earthquake forces much larger than hurricane forces, designing cladding that can withstand stronger hurricane-level forces is likely to be an even smaller percentage of total project cost.

Someday, the way design codes deal with extreme events such as hurricanes, in Mexico and elsewhere, will hopefully evolve to account more broadly for what is at risk at the urban scale. Unfortunately, as I explain in “The Blessings of Disaster,” we will see more extreme disasters before society truly becomes disaster resilient.

The Conversation

Michel Bruneau does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Read more …Acapulco was built to withstand earthquakes, but not Hurricane Otis' destructive winds – how...

Storm Ciarán pounded England's Newhaven Lighthouse and harbor wall on Nov. 4, 2023. AP Photo/Kin Cheung

As ocean waves rise and fall, they apply forces to the sea floor below and generate seismic waves. These seismic waves are so powerful and widespread that they show up as a steady thrum on seismographs, the same instruments used to monitor and study earthquakes.

That wave signal has been getting more intense in recent decades, reflecting increasingly stormy seas and higher ocean swell.

In a new study in the journal Nature Communications, colleagues and I tracked that increase around the world over the past four decades. These global data, along with other ocean, satellite and regional seismic studies, show a decadeslong increase in wave energy that coincides with increasing storminess attributed to rising global temperatures.

What seismology has to do with ocean waves

Global seismographic networks are best known for monitoring and studying earthquakes and for allowing scientists to create images of the planet’s deep interior.

These highly sensitive instruments continuously record an enormous variety of natural and human-caused seismic phenomena, including volcanic eruptions, nuclear and other explosions, meteor strikes, landslides and glacier-quakes. They also capture persistent seismic signals from wind, water and human activity. For example, seismographic networks observed the global quieting in human-caused seismic noise as lockdown measures were instituted around the world during the coronavirus pandemic.

However, the most globally pervasive of seismic background signals is the incessant thrum created by storm-driven ocean waves, known as the global microseism.

Two types of seismic signals

Ocean waves generate microseismic signals in two different ways.

The more energetic of the two, known as the secondary microseism, throbs at a period between about eight and 14 seconds. As sets of waves travel across the oceans in various directions, they interfere with one another, creating pressure variations on the sea floor. However, interfering waves aren’t always present, so in this sense, it is an imperfect proxy for overall ocean wave activity.

A second way in which ocean waves generate global seismic signals is called the primary microseism process. These signals are caused by traveling ocean waves directly pushing and pulling on the seafloor. Since water motions within waves fall off rapidly with depth, this occurs in regions where water depths are less than about 1,000 feet (about 300 meters). The primary microseism signal is visible in seismic data as a steady hum with a period between 14 and 20 seconds.
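
As a rough illustration of how that hum can be measured, the sketch below estimates a power spectrum from a ground-motion record and sums the power in the 14-to-20-second primary-microseism band. It is schematic only: the “data” here is a random placeholder, not the processing pipeline used in our study.

```python
# Schematic only: gauge primary-microseism energy by summing spectral power in
# the 14-20 second period band of a vertical-component ground-motion record.
import numpy as np
from scipy.signal import welch

fs = 1.0                                # sampling rate in Hz (assumed)
record = np.random.randn(24 * 3600)     # placeholder for one day of data at 1 sample/second

freqs, psd = welch(record, fs=fs, nperseg=4096)

# Periods of 14-20 seconds correspond to frequencies of 1/20 Hz to 1/14 Hz.
in_band = (freqs >= 1 / 20) & (freqs <= 1 / 14)
primary_power = np.trapz(psd[in_band], freqs[in_band])
print(primary_power)
```

Tracking that band-limited power at each station, season after season and year after year, is what produces the intensity histories described next.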

What the shaking planet tells us

In our study, we estimated and analyzed historical primary microseism intensity back to the late 1980s at 52 seismograph sites around the world with long histories of continuous recording.

We found that 41 (79%) of these stations showed highly significant and progressive increases in energy over the decades.

The results indicate that globally averaged ocean wave energy has increased since the late 20th century at a median rate of 0.27% per year. Since 2000, however, that rate of increase has risen to 0.35% per year.
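
For readers curious how a number like 0.27% per year is obtained, the sketch below fits a log-linear trend to a series of annual energy values. The series here is synthetic, constructed to grow at exactly 0.27% per year; it is not data from the study.

```python
# Synthetic example: recover a percent-per-year growth rate from annual energy values.
import numpy as np

years = np.arange(1988, 2023)
annual_energy = 1.0027 ** (years - years[0])   # synthetic series growing 0.27% per year

# The slope of a straight-line fit to log(energy) is the fractional change per year.
slope, _ = np.polyfit(years, np.log(annual_energy), 1)
print(f"trend: {slope * 100:.2f}% per year")   # prints ~0.27% per year
```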

Ocean wave intensification since the late 1980s: Each circle is a seismic station, with size proportional to the vertical acceleration of the Earth at that station smoothed over three years. Red circles indicate periods when ground motions are larger than the historical median; blue indicate periods when they are smaller. The synchronized graph shows the median vertical acceleration anomaly for all stations and reflects El Niño cycles and a more pronounced increase in recent years. Source: Rick Aster

We found the greatest overall microseism energy in the very stormy Southern Ocean regions near the Antarctic Peninsula. But these results show that North Atlantic waves have intensified the fastest in recent decades relative to their historical levels. That is consistent with recent research suggesting North Atlantic storm intensity and coastal hazards are increasing. Storm Ciarán, which hit Europe with powerful waves and hurricane-force winds in November 2023, was one record-breaking example.

The decadeslong microseism record also shows the seasonal swing of strong winter storms between the Northern and Southern hemispheres. It captures the wave-dampening effects of growing and shrinking Antarctic sea ice, as well as the multi-year highs and lows associated with El Niño and La Niña cycles and their long-range effects on ocean waves and storms.

Homes hang over the edge of a cliff above an ocean beach.
In November 2022, Hurricane Nicole’s intense waves eroded the land beneath several homes in Daytona Beach, Fla. AP Photo/Rebecca Blackwell

Together, these and other recent seismic studies complement the results from climate and ocean research showing that storms, and waves, are intensifying as the climate warms.

A coastal warning

The oceans have absorbed about 90% of the excess heat connected to rising greenhouse gas emissions from human activities in recent decades. That excess energy can translate into more damaging waves and more powerful storms.

Our results offer another warning for coastal communities, where increasing ocean wave heights can pound coastlines, damaging infrastructure and eroding the land. The impacts of increasing wave energy are further compounded by ongoing sea level rise fueled by climate change and by subsidence. And they emphasize the importance of mitigating climate change and building resilience into coastal infrastructure and environmental protection strategies.

The Conversation

Richard Aster receives funding from the U.S. National Science Foundation.

Read more …How global warming shakes the Earth: Seismic data show ocean waves gaining strength as the planet...

Gas stoves without adequate ventilation can produce harmful concentrations of nitrogen dioxide. Sjoerd van der Wal/Getty Images

In 1976, beloved chef, cookbook author and television personality Julia Child returned to WGBH-TV’s studios in Boston for a new cooking show, “Julia Child & Company,” following her hit series “The French Chef.” Viewers probably didn’t know that Child’s new and improved kitchen studio, outfitted with gas stoves, was paid for by the American Gas Association.

While this may seem like any corporate sponsorship, we now know it was a part of a calculated campaign by gas industry executives to increase use of gas stoves across the United States. And stoves weren’t the only objective. The gas industry wanted to grow its residential market, and homes that used gas for cooking were likely also to use it for heat and hot water.

The industry’s efforts went well beyond careful product placement, according to new research from the nonprofit Climate Investigations Center, which analyzes corporate efforts to undermine climate science and slow the ongoing transition away from fossil fuels. As the center’s study and a National Public Radio investigation show, when evidence emerged in the early 1970s about the health effects of indoor nitrogen dioxide exposure from gas stove use, the American Gas Association launched a campaign designed to manufacture doubt about the existing science.

As a researcher who has studied air pollution for many years – including gas stoves’ contribution to indoor air pollution and health effects – I am not naïve about the strategies that some industries use to avoid or delay regulations. But I was surprised to learn that the multipronged strategy related to gas stoves directly mirrored tactics that the tobacco industry used to undermine and distort scientific evidence of health risks associated with smoking starting in the 1950s.

The gas industry is defending natural gas stoves, which are under fire for their health effects and their contribution to climate change.

Manufacturing controversy

The gas industry relied on Hill & Knowlton, the same public relations company that masterminded the tobacco industry’s playbook for responding to research linking smoking to lung cancer. Hill & Knowlton’s tactics included sponsoring research that would counter findings about gas stoves published in the scientific literature, emphasizing uncertainty in these findings to construct artificial controversy and engaging in aggressive public relations efforts.

For example, the gas industry obtained and reanalyzed the data from an EPA study on Long Island that showed more respiratory problems in homes with gas stoves. Their reanalysis concluded that there were no significant differences in respiratory outcomes.

The industry also funded its own health studies in the early 1970s, which confirmed large differences in nitrogen dioxide exposures but did not show significant differences in respiratory outcomes. These findings were documented in publications where industry funding was not disclosed. These conclusions were amplified in numerous meetings and conferences and ultimately influenced major governmental reports summarizing the state of the literature.

This campaign was remarkable, since the basics of how gas stoves affected indoor air pollution and respiratory health were straightforward and well established at the time. Burning fuel, including natural gas, generates nitrogen oxides: The air in Earth’s atmosphere is about 78% nitrogen and 21% oxygen, and these gases react at high temperatures.

Nitrogen dioxide is known to adversely affect respiratory health. Inhaling it causes respiratory irritation and can worsen diseases such as asthma. This is a key reason why the U.S. Environmental Protection Agency established an outdoor air quality standard for nitrogen dioxide in 1971.

No such standards exist for indoor air, but as the EPA now acknowledges, nitrogen dioxide exposure indoors also is harmful.

Infographic about nitrogen dioxide as an asthma trigger
More than 27 million people in the U.S. have asthma, including about 4.5 million children under age 18. Non-Hispanic Black children are two times more likely to have asthma compared with non-Hispanic white children. EPA

How harmful is indoor exposure?

The key question is whether nitrogen dioxide exposure related to gas stoves is large enough to lead to health concerns. While levels vary across homes, scientific research shows that the simple answer is yes – especially in smaller homes and when ventilation is inadequate.

This has been known for a long time. For example, a 1998 study that I co-authored showed that the presence of gas stoves was the strongest predictor of personal exposure to nitrogen dioxide. And work dating back to the 1970s showed that indoor nitrogen dioxide levels in the presence of gas stoves could be far higher than outdoor levels. Depending on ventilation levels, concentrations could reach levels known to contribute to health risks.

Despite this evidence, the gas industry’s campaign was largely successful. Industry-funded studies successfully muddied the waters, as I have seen over the course of my research career, and stalled further federal investigations or regulations addressing gas stove safety.

This issue took on new life at the end of 2022, when researchers published a new study estimating that 12.7% of U.S. cases of childhood asthma – about one case in eight – were attributable to gas stoves. The industry continues to cast doubt on gas stoves’ contribution to health effects and fund pro-gas stove media campaigns.

A concern for climate and health

Residential gas use is also controversial today because it slows the ongoing shift toward renewable energy, at a time when the impacts of climate change are becoming alarmingly clear. Some cities have already moved or are considering steps to ban gas stoves in new construction and shift toward electrifying buildings.

As communities wrestle with these questions, regulators, politicians and consumers need accurate information about the risks of gas stoves and other products in homes. There is room for vigorous debate that considers a range of evidence, but I believe that everyone has a right to know where that evidence comes from.

The commercial interests of many industries, including alcohol, tobacco and fossil fuels, aren’t always compatible with the public interest or human health. In my view, exposing the tactics that vested interests use to manipulate the public can make consumers and regulators savvier and help deter other industries from using their playbook.

The Conversation

Jonathan Levy has received funding from the National Institutes of Health, the U.S. Environmental Protection Agency, the U.S. Department of Housing and Urban Development, and the Health Effects Institute for studies on the contribution of outdoor and indoor sources to air pollution levels in homes.

Read more …When science showed in the 1970s that gas stoves produced harmful indoor air pollution, the...
