As temperatures soared to 128 degrees, Death Valley smashed heat records in June

Lexington Herald-Leader


The hottest place on Earth had its warmest June on record this year.

Death Valley National Park recorded an average temperature of 102.9 degrees in June, according to the National Park Service. That’s nearly 8 degrees hotter than what’s typical.

On June 17, it reached an even hotter peak.

“The heat wave that affected much of the West in mid-June peaked at 128 degrees in Death Valley on June 17, which broke the daily record by 6 degrees,” the National Park Service said Friday in a news release. “Seven days in the month set new daily records for high temperatures.”

Even the lowest temperature at the park that month was still above 100 degrees. At 3 a.m. on June 29, the temperature dropped to 104 degrees.

Last summer was also a hot one for Death Valley, McClatchy News reported. From June through August in 2020 — the meteorological summer — Death Valley had an average temperature of 102.7 degrees, according to the National Park Service.

It was the fourth hottest summer on record, following 2019, 2017 and 2016.

The park, which sits on the California-Nevada border, usually averages 18 days that hit 120 degrees or more, officials said.

“Death Valley’s dramatic landscape ranges from 282 feet below sea level to 11,049 feet above,” the National Park Service said. “Clear, dry air, and minimal plant coverage means there’s little to block the sun from heating up the ground. Heat radiates from the ground back into the air.”

Hot air in the park rises and gets trapped by the surrounding mountains. Then it recirculates to the valley floor and the heating cycle continues, park officials said.

“The park’s extreme heat attracts people seeking to experience a temperature hotter than they ever have before,” park officials said. “Park rangers say it is possible to visit Death Valley safely in the summer. Limit heat exposure by not walking more than 5 minutes from an air-conditioned vehicle.”

Death Valley isn’t the only place experiencing record-breaking heat recently.

Many parts of the West have shattered heat records, and temperatures have soared above 100 degrees for days on end.

In Portland, temperatures reached 112 degrees Sunday, breaking the 108-degree record set just the day before and marking the region's highest temperature since records began in 1940, according to the National Weather Service.

Temperatures in Seattle also reached an all-time high of 104 degrees, the first time temperatures were above 100 degrees for two consecutive days in the region.

Water crisis reaches boiling point on Oregon-California line

TULELAKE, Calif. (AP) — Ben DuVal knelt in a barren field near the California-Oregon state line and scooped up a handful of parched soil as dust devils whirled around him and birds flitted between empty irrigation pipes.

DuVal’s family has farmed the land for three generations, and this summer, for the first time ever, he and hundreds of others who rely on irrigation from a depleted, federally managed lake aren’t getting any water from it at all.

As farmland goes fallow, Native American tribes along the 257-mile (407-kilometer) long river that flows from the lake to the Pacific Ocean watch helplessly as fish that are inextricable from their diet and culture die in droves or fail to spawn in shallow water.

Just a few weeks into summer, a historic drought and its on-the-ground consequences are tearing communities apart in this diverse basin filled with flat vistas of sprawling alfalfa and potato fields, teeming wetlands and steep canyons of old-growth forests.

Competition over the water from the river has always been intense. But this summer there is simply not enough, and the farmers, tribes and wildlife refuges that have long competed for every drop now face a bleak and uncertain future together.

“Everybody depends on the water in the Klamath River for their livelihood. That’s the blood that ties us all together. … They want to have the opportunity to teach their kids to fish for salmon just like I want to have the opportunity to teach my kids how to farm,” DuVal said of the downriver Yurok and Karuk tribes. “Nobody’s coming out ahead this year. Nobody’s winning.”

With the decades-long conflict over water rights reaching a boiling point, those living the nightmare worry the Klamath Basin’s unprecedented drought is a harbinger as global warming accelerates.

“For me, for my family, we see this as a direct result of climate change,” said Frankie Myers, vice chairman of the Yurok Tribe, which is monitoring a massive fish kill where the river enters the ocean. “The system is crashing, not just for Yurok people … but for people up and down the Klamath Basin, and it’s heartbreaking.”

ROOTS OF A CRISIS

Twenty years ago, when water feeding the farms was drastically reduced amid another drought, the crisis became a national rallying cry for the political right, and some protesters breached a fence and opened the main irrigation canal in violation of federal orders.

But today, as reality sinks in, many irrigators reject the presence of anti-government activists who have once again set up camp. In the aftermath of the Jan. 6 insurrection at the U.S. Capitol, irrigators who are at risk of losing their farms and in need of federal assistance fear any ties to far-right activism could taint their image.

Some farmers are getting groundwater from wells, blunting their losses, and a small number who get flows from another river will have severely reduced water for just part of the summer. Everyone is sharing what water they have.

“It’s going to be people on the ground, working together, that’s going to solve this issue,” said DuVal, president of the Klamath Water Users Association. “What can we live with, what can those parties live with, to avoid these train wrecks that seem to be happening all too frequently?”

Meanwhile, toxic algae is blooming in the basin’s main lake — vital habitat for endangered suckerfish — a month earlier than normal, and two national wildlife refuges that are a linchpin for migratory birds on the Pacific Flyway are drying out. Environmentalists and farmers are using pumps to combine water from two stagnant wetlands into one deeper pool to prevent another outbreak of avian botulism like the one that killed 50,000 ducks last summer.

The activity has exposed acres of arid, cracked landscape that likely hasn’t been above water for thousands of years.

“There’s water allocated that doesn’t even exist. This is all unprecedented. Where do you go from here? When do you start having the larger conversation of complete unsustainability?” said Jamie Holt, lead fisheries technician for the Yurok Tribe, who counts dead juvenile chinook salmon every day on the lower Klamath River.

“When I first started this job 23 years ago, extinction was never a part of the conversation,” she said of the salmon. “If we have another year like we’re seeing now, extinction is what we’re talking about.”

The extreme drought has exacerbated a water conflict that traces its roots back more than a century.

Beginning in 1906, the federal government reengineered a complex system of lakes, wetlands and rivers in the 10 million-acre (4 million-hectare) Klamath River Basin to create fertile farmland. It built dikes and dams to block and divert rivers, redirecting water away from a natural lake spanning the California-Oregon border.

Evaporation then reduced the lake to one-quarter of its former size and created thousands of arable acres in an area that had been underwater for millennia.

In 1918, the U.S. began granting homesteads on the dried-up parts of Tule Lake. Preference was given to World War I and World War II veterans, and the Klamath Reclamation Project quickly became an agricultural powerhouse. Today, farmers there grow everything from mint to alfalfa to potatoes that go to In-N-Out Burger, Frito-Lay and Kettle Foods.

Water draining off the fields flowed into national wildlife refuges that continue to provide respite each year for tens of thousands of birds. Within the altered ecosystem, the refuges comprise a picturesque wetland oasis nicknamed the Everglades of the West that teems with white pelicans, grebes, herons, bald eagles, blackbirds and terns.

Last year, amid a growing drought, the refuges got little water from the irrigation project. This summer, they will get none.

SPEAKING FOR THE FISH

While in better water years, the project provided some conservation for birds, it did not do the same for fish — or for the tribes that live along the river.

The farmers draw their water from the 96-square-mile (248-square-kilometer) Upper Klamath Lake, which is also home to suckerfish. The fish are central to the Klamath Tribes’ culture and creation stories and were for millennia a critical food source in a harsh landscape.

In 1988, two years after the tribe regained federal recognition, the U.S. Fish and Wildlife Service listed two species of suckerfish that spawn in the lake and its tributaries as endangered. The federal government must keep the extremely shallow lake at a minimum depth for spawning in the spring and to keep the fish alive in the fall when toxic algae blooms suck out oxygen.

This year, amid exceptional drought, there was not enough water to ensure those levels and supply irrigators. Even with the irrigation shutoff, the lake’s water has fallen below the mandated levels — so low that some suckerfish were unable to reproduce, said Alex Gonyaw, senior fish biologist for the Klamath Tribes.

The youngest suckerfish in the lake are now nearly 30 years old, and the tribe’s projections show both species could disappear within the next few decades. It says even when the fish can spawn, the babies die because of low water levels and a lack of oxygen. The tribe is now raising them in captivity and has committed to “speak for the fish” amid the profound water shortage.

“I don’t think any of our leaders, when they signed the treaties, thought that we’d wind up in a place like this. We thought we’d have the fish forever,” said Don Gentry, Klamath Tribes chairman. “Agriculture should be based on what’s sustainable. There’s too many people after too little water.”

But with the Klamath Tribes enforcing their senior water rights to help suckerfish, there is no extra water for downriver salmon — and now tribes on different parts of the river find themselves jockeying for the precious resource.

The Karuk Tribe last month declared a state of emergency, citing climate change and the worst hydrologic conditions in the Klamath River Basin in modern history. Karuk tribal citizen Aaron Troy Hockaday Sr. used to fish for salmon at a local waterfall with a traditional dip net. But he says he hasn’t caught a fish in the river since the mid-1990s.

“I got two grandsons that are 3 and 1 years old. I’ve got a baby grandson coming this fall. I’m a fourth-generation fisherman, but if we don’t save that one fish going up the river today, I won’t be able to teach them anything about our fishing,” he said. “How can I teach them how to be fishermen if there’s no fish?”

‘IT’S LIKE A BIG, DARK CLOUD’

The downstream tribes’ problems are compounded by hydroelectric dams, separate from the irrigation project, that block the path of migrating salmon.

In most years, the tribes 200 miles (320 kilometers) to the southwest of the farmers, where the river reaches the Pacific, ask the Bureau of Reclamation to release pulses of extra water from Upper Klamath Lake. The extra flows mitigate outbreaks of a parasitic disease that proliferates when the river is low.

This year, the federal agency refused those requests, citing the drought.

Now, the parasite is killing thousands of juvenile salmon in the lower Klamath River, where the Karuk and Yurok tribes have coexisted with them for millennia. Last month, tribal fish biologists determined 97% of juvenile spring chinook on a critical stretch of the river were infected; recently, 63% of fish caught in research traps near the river’s mouth have been dead.

The die-off is devastating for people who believe they were created to safeguard the Klamath River’s salmon and who are taught that if the salmon disappear, their tribe is not far behind.

“Everybody’s been promised something that just does not exist anymore,” said Holt, the Yurok fisheries expert. “We are so engrained within our environment that we do see these changes, and these changes make us change our way of life. Most people in the world don’t get to see that direct correlation — climate change means less fish, less food.”

Hundreds of miles to the northeast, near the river’s source, some of the farmers who are seeing their lives upended by the same drought now say a guarantee of less water — but some water — each year would be better than the parched fields they have now. And there is concern that any problems in the river basin — even ones caused by a drought beyond their control — are blamed on a way of life they also inherited.

“I know turning off the project is easy,” said Tricia Hill, a fourth-generation farmer who returned to take over the family farm after working as an environmental lawyer.

“But sometimes the story that gets told … doesn’t represent how progressive we are here and how we do want to make things better for all species. This single-species management is not working for the fish — and it’s destroying our community and hurting our wildlife.”

DuVal’s daughter also dreams of taking over her family’s farm someday. But DuVal isn’t sure he and his wife, Erika, can hang onto it if things don’t change.

“To me it’s a like a big, dark cloud that follows me around all the time. It’s depressing knowing that we had a good business and that we had a plan on how we’re going to grow our farm and to be able to send my daughters to a good college,” said DuVal. “And that plan just unravels further and further with every bad water year.”

Sixty years of climate change warnings: the signs that were missed (and ignored)

Homes destroyed by a storm in New York state in 1962. Photograph: Bettmann/Getty/Guardian Design
In August 1974, the CIA produced a study on “climatological research as it pertains to intelligence problems”. The diagnosis was dramatic. It warned of the emergence of a new era of weird weather, leading to political unrest and mass migration (which, in turn, would cause more unrest). The new era the agency imagined wasn’t necessarily one of hotter temperatures; the CIA had heard from scientists warning of global cooling as well as warming. But the direction in which the thermometer was travelling wasn’t their immediate concern; it was the political impact. They knew that the so-called “little ice age”, a series of cold snaps between, roughly, 1350 and 1850, had brought not only drought and famine, but also war – and so could these new climatic changes.


“The climate change began in 1960,” the report’s first page informs us, “but no one, including the climatologists, recognized it.” Crop failures in the Soviet Union and India in the early 1960s had been attributed to standard unlucky weather. The US shipped grain to India and the Soviets killed off livestock to eat, “and premier Nikita Khrushchev was quietly deposed”.

But, the report argued, the world ignored this warning, as the global population continued to grow and states made massive investments in energy, technology and medicine.

Meanwhile, the weird weather rolled on, shifting to a collection of west African countries just below the Sahara. People in Mauritania, Senegal, Mali, Burkina Faso, Niger and Chad “became the first victims of the climate change”, the report argued, but their suffering was masked by other struggles – or the richer parts of the world simply weren’t paying attention. As the effects of climate change started to spread to other parts of the world, the early 1970s saw reports of droughts, crop failures and floods from Burma, Pakistan, North Korea, Costa Rica, Honduras, Japan, Manila, Ecuador, USSR, China, India and the US. But few people seemed willing to see a pattern: “The headlines from around the world told a story still not fully understood or one we don’t want to face,” the report said.

Floods in Benares, India, circa 1970. Photograph: Paolo Koch/Gamma-Rapho/Getty Images

This claim that no one was paying attention was not entirely fair. Some scientists had been talking about the issue for a while. It had been in newspapers and on television, and was even mentioned in a speech by US president Lyndon Johnson in 1965. A few months before the CIA report was issued, the US secretary of state, Henry Kissinger, had addressed the UN under a banner of applying science to “the problems that science has helped to create”, including his worry that the poorest nations were now threatened with “the possibility of climatic changes in the monsoon belt and perhaps throughout the world”.

Still, the report’s authors had a point: climate change wasn’t getting the attention it could have, and there was a lack of urgency in discussions. There was no large public outcry, nor did anyone seem to be trying to generate one.

Although initially prepared as a classified working paper, the report ended up in the New York Times a few years later. By this point, February 1977, the problem of burning fossil fuels was seen more through the lens of the domestic oil crisis than of overseas famine. The climate crisis might still feel remote, the New York Times mused, but as Americans felt the difficulties of unusual weather combined with shortages of oil, perhaps this might unlock some change? The paper reported that both energy and climate experts shared the hope “that the current crisis is severe enough and close enough to home to encourage the interest and planning required to deal with these long-range issues before the problems get too much worse”.

And yet, if anything, debate about climate change in the last third of the 20th century would be characterized as much by delay as by concern, not least because of something the political analysts at the CIA seem to have missed: fightback from the fossil fuel industries.


When it came to constructing that delay, the spin doctors could find building materials readily available within the scientific community itself. In 1976, a young climate modeller named Stephen Schneider decided it was time for someone in the climate science community to make a splash. As a graduate student at Columbia University, Schneider wanted to find a research project that could make a difference. While hanging out at the Nasa Goddard Institute for Space Studies, he stumbled across a talk on climate models. He was inspired: “How exciting it was that you could actually simulate something as crazy as the Earth, and then pollute the model, and figure out what might happen – and have some influence on policy in a positive way,” he later recalled.

After years of headlines about droughts and famine, Schneider figured the time was right for a popular science book on the danger climate change could cause. The result was his 1976 book, The Genesis Strategy. Although he wanted to avoid positioning himself alongside either what he called the “prophets of doom” on one side or the “Pollyannas” on the other, he felt it was important to impart the gravity of climate change and catch people’s attention.

And attention it got, with a jacket endorsement from physicist Carl Sagan, reviews in the Washington Post and New York Times, and an invitation to appear on Johnny Carson’s Tonight Show. This rankled some of the old guard, who felt this just wasn’t the way to do science. Schneider’s book drew an especially scathing attack from Helmut Landsberg, who had been director of the Weather Bureau’s office of climatology, and was now a well-respected professor at the University of Maryland.

Landsberg reviewed the book for the American Geophysical Union, calling it a “wide-ranging potpourri of science, nature and politics”, and “multidisciplinary, as promised, but also very undisciplined”. Landsberg disliked what he saw as an activist spirit in Schneider, believing that climate scientists should stay out of the public spotlight, especially when it came to the uncertainties of climate modelling. He would only endanger the credibility of climatologists, Landsberg worried; much better to stay collecting data to iron out as many uncertainties as possible, only guardedly briefing politicians behind closed doors when absolutely needed. In an example of first-class scientific bitching, Landsberg concluded his review by noting that Schneider advocated scientists running for public office, and that perhaps he had better try that himself – but that if he did want to be a serious scientist, “one might suggest that he spend less time going to the large number of meetings and workshops that he seems to frequent” and join a scientific library.

Nomads pick up bran sticks dropped by plane by the French air force during a 1974 drought in the Sahel, south of the Sahara Desert. Photograph: Alain Nogues/Sygma/Getty Images

In part, it was a generational clash. Schneider belonged to a younger, more rebellious cohort, happy to take science to the streets. In contrast, Landsberg had spent a career working carefully with government and the military, generally behind closed doors, and was scared that public involvement might disrupt the delicate balance of this relationship. What’s more, the cultural norms of scientific behavior that expect a “good” scientist to be guarded and avoid anything that smells remotely of drama were deeply embedded – even when, like any deeply embedded cultural norm, they can skew the science. Landsberg was far from the only established meteorologist bristling at all this new attention given to climate change. Some felt uneasy about the drama, while others didn’t trust the new technologies, disciplines and approaches being used.

In the UK, the head of the Met Office, John Mason, called concern about climate change a “bandwagon” and set about trying to “debunk alarmist US views”. In 1977 he gave a public talk at the Royal Society of Arts, stressing that there were always fluctuations in climate, and that the recent droughts were not unprecedented.

He agreed that if we were to continue to burn fossil fuels at the rate we were, we might have 1C warming, which he thought was “significant”, in the next 50-100 years; but on the whole, he thought, the atmosphere was a system that would take whatever we threw at it. Plus, like many of his contemporaries, he figured we would all move over to nuclear power, anyway. Writing up the talk for Nature, John Gribbin described the overall message as “don’t panic”. He reassured readers there was no need to listen to “the prophets of doom”.


Change was coming, though, and it would be a combination of an establishment scientist and an activist that would kick it off. An obscure 1978 US Environmental Protection Agency report on coal ended up on the desk of Rafe Pomerance, a lobbyist at the DC offices of Friends of the Earth. It mentioned the “greenhouse effect”, noting that fossil fuels could have significant and damaging impacts on the atmosphere in the next few decades.

He asked around the office and someone handed him a recent newspaper article by a geophysicist called Gordon MacDonald. MacDonald was a high-ranking American scientist who had worked on weather modification in the 1960s as an advisor to Johnson. In 1968 he had written an essay called How to Wreck the Environment, imagining a future in which we had resolved threats of nuclear war but instead weaponized the weather. Since then he had watched people do this – not deliberately, as a means of war, but more carelessly, simply by continuing to burn fossil fuels.

More importantly, MacDonald was also a “Jason” – a member of a secret group of elite scientists who met regularly to give the government advice, outside of the public eye. The Jason group had met to discuss carbon dioxide and climate change in the summers of 1977 and 1978, and MacDonald had appeared on US TV to argue that the earth was warming.

Professor Stephen Schneider talks at Stanford University in 2008. Photograph: ZUMA Press, Inc./Alamy

You might imagine there was some culture clash between Pomerance, a Friends of the Earth lobbyist, and MacDonald, a secret military scientist, but they made a powerful team. They got a meeting with Frank Press, the president’s science advisor, who brought along the entire senior staff of the US Office of Science and Technology. After MacDonald outlined his case, Press said he would ask the former head of the meteorology department at MIT, Jule Charney, to look into it. If Charney said a climate apocalypse was coming, the president would act.

Charney summoned a team of scientists and officials, along with their families, to a large mansion at Woods Hole, on the south-western spur of Cape Cod. Charney’s brief was to assemble atmospheric scientists to check the Jasons’ report, and he invited two leading climate modellers to present the results of their more detailed, richer models: James Hansen at the Goddard Institute for Space Studies at Columbia University in New York, and Syukuro Manabe of the Geophysical Fluid Dynamics Lab in Princeton.

The scientific proceedings were held in the old carriage house of the mansion, with the scientists on a rectangle of desks in the middle and political observers around the side. They dryly reviewed principles of atmospheric science and dialled in Hansen and Manabe. The two models offered slightly different warnings about the future, and in the end, Charney’s group decided to split the difference. They felt able to say with confidence that the Earth would warm by about 3C in the next century, plus or minus 50% (that is, we would see warming of between 1.5C and 4C). When the group’s report appeared in November 1979, Science magazine declared: “Gloomsday predictions have no fault.”

By the mid-1970s, the biggest oil company in the world, Exxon, was starting to wonder if climate change might finally be about to arrive on the political agenda and start messing with its business model. Maybe it was the reference in the Kissinger speech, or Schneider’s appearance on the Tonight Show. Or maybe it was just that the year 2000 – the point after which scientists warned things were going to start to hurt – didn’t seem quite so far off.

In the summer of 1977, James Black, one of the top science advisors at Exxon, made a presentation on the greenhouse effect to the company’s most senior staff. This was a big deal: executives at that level would only want to know about science that would affect the bottom line. The same year, the company hired Edward David Jr to head up their research labs. He had learned about climate change while working as an advisor to Nixon. Under David, Exxon started to build a small research project on carbon dioxide. Small, at least, by Exxon standards – at $1m a year, it was a good chunk of cash, just not much compared with the $300m a year the company spent on research at large.

In December 1978, Henry Shaw, the scientist leading Exxon’s carbon dioxide research, wrote in a letter to David that Exxon “must develop a credible scientific team”, one that can critically evaluate science that comes in on the topic, and “be able to carry bad news, if any, to the corporation”.

Starving cattle roam a cracked landscape in Mauritania in search of water, 1978. Photograph: Alain Nogues/Sygma/Getty Images

Exxon fitted out one of its largest supertankers with custom-made instruments to do ocean research. Exxon wanted to be taken seriously as a credible player, so it sought leading scientists and was willing to ensure they had scientific freedom. Indeed, some of the work they undertook with oceanographer Taro Takahashi would later be used in a 2009 paper concluding that the oceans absorb only 20% of carbon dioxide emitted from human activities. This work earned Takahashi a Champion of the Earth prize from the UN.

In October 1982, David told a global warming conference financed by Exxon: “Few people doubt that the world has entered an energy transition, away from dependence upon fossil fuels and toward some mix of renewable resources that will not pose problems of CO2 accumulation.”

The only question, he said, was how fast this would happen. Maybe he really did see Exxon as about to lead the way on innovation toward zero-carbon fuels, with his R&D lab at the center of it. Or maybe the enormity of the challenge hadn’t really sunk in. Either way, by the mid-1980s the carbon dioxide research had largely dried up.


When Ronald Reagan was elected in November 1980, he appointed lawyer James G Watt to run the Department of the Interior. Watt had headed a legal firm that fought to open public lands for drilling and mining, and already had a reputation for hating conservation projects, as a matter of policy and of faith. He once famously described environmentalism as “a leftwing cult dedicated to bringing down the type of government I believe in”. The head of the National Coal Association pronounced himself “deliriously happy” at the appointment, and corporate lobbyists started joking: “How much power does it take to stop a million environmentalists? One Watt.”

Watt didn’t close the EPA, as people initially feared he would, but he did appoint Anne Gorsuch, an anti-regulation zealot who cut it by a quarter. Pomerance and his colleagues in the environmental movement were going to be busy. They didn’t exactly have much time for picking up that lingering and still quite abstract problem of climate change. It would still be a while before Pomerance would see a public movement for climate action.

Just before the November 1980 election, the National Academy of Sciences (NAS) had set up a new Carbon Dioxide Assessment Committee to do a follow-up to the Charney report. The chair was Bill Nierenberg, one of the generation of scientists who, like Helmut Landsberg, had been through both the war and the subsequent boom in science funding. He was quite at home working with the government and military. He was even a Jason. He had been a fierce defender of the Vietnam war, which had set him apart from some of his colleagues, and he was still bitter about some of the leftwing protests on campus at the end of the 1960s, and the pushback against military-sponsored science that they had inspired. He also hated the environmentalist movement, which he saw as a band of Luddites, especially on the issue of nuclear power. In many ways, he must have seemed like the perfect person to lead a review that would report back to the new President Reagan.

Firefighters at work in Torres del Paine National Park, Chile, in 2012. Photograph: STR/AP

Nierenberg decided to build his report around a mix of economics and science. In theory, this should have been brilliant. But when it came to publication, the two sides did not cohere. The writers had not worked together, but rather been sent off to be scientists in one corner and economists in another. It has been described as a report of two quite different views – five chapters by scientists that agreed global warming was a major problem, and then two more by economists that focused on the uncertainty that still existed about the physical impacts, especially beyond the year 2000, and even greater uncertainty about how this would play out economically. What’s more, it was the economists’ take on things that got to frame the report, as the first and last chapters, and whose analysis dominated the overall message. Nierenberg seemed to be advocating a wait-and-see approach. There is no particular solution to the problem, he argued at the start of the report, but we can’t avoid it: “We simply must learn to deal more effectively with their twists and turns as they unfold.”

For their 2010 book about climate skepticism, Merchants of Doubt, Naomi Oreskes and Erik Conway dug out the peer-review notes on Nierenberg’s report from the NAS archives. One of the reviews was from Alvin Weinberg, a physicist who had been raising concerns about climate change since the 1970s, and he was less than impressed. In fact, it might be better to say he was appalled by the stance Nierenberg had taken. At one point the report had suggested people would probably adapt, largely by moving. People had migrated because of climate change in the past, it argued, and they would manage again: “It is extraordinary how adaptable people can be,” the report muses.

Weinberg was scathing: “Does the committee really believe the United States or Western Europe or Canada would accept the huge influx of refugees from poor countries that have suffered a drastic shift in rainfall pattern?” Oreskes and Conway did some digging into the reviews and noted that Weinberg’s was not the only negative one (although the others were slightly more polite). Puzzled as to why these criticisms were not responded to, a senior scientist later explained to them: “Academy review was much more lax in those days.”

In the end, the report was launched in October 1983, at a formal gala with cocktails and dinner at the NAS’s cathedral-like Great Hall. Peabody Coal, General Motors and Exxon were all on the invite list – and Pomerance managed to sneak in via the press conference. The White House had briefed the Academy from the get-go, making it clear it did not approve of speculative, alarmist or “wolf-crying” scenarios; that it thought technology would find the answer and it did not expect to do anything other than fund research and see what happened. The NAS knew these people would be in charge for the next few years, and possibly figured that the best idea was to give them the most scientific version they could find of what the White House wanted. Or possibly it simply was what Nierenberg believed. Either way, from the perspective of today, it’s hard not to see it as a big misstep.

The report’s introduction stated up front: “Our stance is conservative: we believe there is reason for caution, not panic.” At the press conference, Roger Revelle, the first scientist to brief Congress on the climate crisis, back in 1957, told reporters they were flashing an amber light, not a red one. And so, the Wall Street Journal reported: “A panel of top scientists has some advice for people worried about the much-publicized warming of the Earth’s climate: you can cope.”


Where were the activists in all of this? Where was that big public movement for action on climate change that campaigners such as Pomerance were longing for? Environmental groups were booming, both in mainstream NGOs and more radical groups, but they tended to focus on other environmental issues, such as saving the whale or the rainforests, or fighting road-building. It wasn’t really until the 2000s that we saw the emergence of climate-specific groups and climate dominating the larger NGOs’ portfolios.

If anything, the first really active, explicit climate campaigners were the skeptics. Climate skepticism is as old as climate science itself, and in the early days it was an entirely sensible position. It is normal for scientists to raise a quizzical eyebrow when something new is presented to them. The oil industry took this natural scientific skepticism and tapped it.

A flooded farm, Hato Grande, on the northern outskirts of Bogota, Colombia, in 2011. Photograph: William Fernando Martinez/AP

But just as the consensus about the greenhouse effect was starting to harden, and the skeptics starting to fall away, in the 1980s, there was a deliberate, organized effort to amplify that natural doubt, extend it, and use it to dismiss and distract from warnings to take action on climate change. And that wasn’t science, even if on occasion it used scientists – that was PR. It did not necessarily mean creating phoney science. (That could work, too, but would only get you so far.) You would fund real scientists, but in a way that would confuse and muddy the message. They had done this before, with air pollution in the 1940s, and their PR companies had picked up a trick or two from fights about the links between tobacco and cancer.

The chief executives of the major oil companies met and agreed to set aside funds – only $100,000 for now, but it would grow – to work on climate policy, establishing the very legitimate-sounding Global Climate Coalition. Before long, groups such as this started to proliferate – the Information Council on the Environment, the Cooler Heads Coalition, the Global Climate Information Project – and any science-smelling voice expressing skeptical views was amplified. Bill Nierenberg was a particular favorite. The delayers knew their best strategy was to get involved in the scientific and policy debate – it was there that they would be best placed to push the uncertainties and question regulations. Sometimes fossil fuel companies and their defenders get painted as “anti-science”. In truth they run on science, and always have done – they are just strategic about which bits of it they use.


One of the hardest parts of writing about the history of the climate crisis was stumbling across warnings from the 1950s, 60s and 70s, musing about how things might get bad sometime after the year 2000 if no one did anything about fossil fuels. They still had hope back then. Reading that hope today hurts.

We are now living our ancestors’ nightmares, and it didn’t have to be this way. If we are looking to apportion blame, it is those who deliberately peddled doubt that should be first in line. But it is also worth looking at the cultures of scientific work that have developed over centuries, some of which could do with an update. The doubt-mongers manipulated positive forces in science – such as skepticism – for their own ends, but they also made use of other resources, exacerbating generational divides, exploiting the scientific community’s tendency to avoid drama, and steering notions about who were legitimate political partners (eg governments) and who were not (activists).

Scientists working on climate change have been put in an incredibly difficult position. They should have been given time, expert support and a decent budget to think about the multiple challenges and transformations that happen when you take a contentious bit of science out of the scientific community and put it in the public sphere. They should have been given that support from government, but they also needed the gatekeepers within the scientific community to help them, too. And yet, if anything, many of these scientists have been ridiculed by their colleagues for speaking to media or – perish the thought – showing emotion.


As citizens of the 21st century, we have inherited an almighty mess, but we have also inherited a lot of tools that could help us and others survive. A star among these tools – sparkling alongside solar panels, heat pumps, policy systems and activist groups – is modern climate science. It really wasn’t all that long ago that our ancestors simply looked at air and thought it was just that – thin air – rather than an array of different chemicals; chemicals that you breathe in or out, that you might set fire to or could get high on, or that might, over several centuries of burning fossil fuels, have a warming effect on the Earth.

When climate fear starts to grip, it is worth remembering that we have knowledge that offers us a chance to act. We could, all too easily, be sitting around thinking: “The weather’s a bit weird today. Again.”

This is an edited extract from Our Biggest Experiment: An Epic History of the Climate Crisis by Alice Bell, published on 8 July by Bloomsbury and available at guardianbookshop.co.uk


Solar Is Dirt-Cheap and About to Get Even More Powerful

Bloomberg


After focusing for decades on cutting costs, the solar industry is shifting attention to making new advances in technology.

The solar industry has spent decades slashing the cost of generating electricity direct from the sun. Now it’s focusing on making panels even more powerful.

With savings from equipment manufacturing hitting a plateau, and raw-material prices more recently on the rise, producers are stepping up work on advances in technology: building better components and employing increasingly sophisticated designs to generate more electricity from the same-sized solar farms.

“The first 20 years in the 21st century saw huge reductions in module prices, but the speed of the reduction started to level off noticeably in the past two years,” said Xiaojing Sun, global solar research leader at Wood Mackenzie Ltd. “Fortunately, new technologies will create further cost-of-electricity reductions.”

Solar Slide: photovoltaic panel cost declines have slowed in recent years. Source: PVinsights

A push for more powerful solar equipment underscores how further cost reductions remain essential to advance the shift away from fossil fuels. While grid-sized solar farms are now typically cheaper than even the most advanced coal or gas-fired plants, additional savings will be required to pair clean energy sources with the expensive storage technology that’s needed for around-the-clock carbon-free power.

High levels of cancer-causing chemical found in parts of Houston -report

FILE PHOTO: The Houston Ship Channel and adjacent refineries, part of the Port of Houston, are seen in Houston.

HOUSTON (Reuters) - High levels of a cancer-causing chemical have been detected in air monitors in Houston neighborhoods near the busiest U.S. petrochemical port, according to a report issued on Thursday by Houston health officials and environmental groups.

The report https://bit.ly/3hqafvk by the Houston Health Department and One Breath Partnership said concentrations of formaldehyde were found at levels 13 times the U.S. Environmental Protection Agency’s minimum level for health threats.

It recommended that regulations for plants be tightened and that chemicals contributing to formaldehyde formation be controlled. Formaldehyde levels appear to be rising in Houston even as the Texas Commission on Environmental Quality’s air-monitoring sampling frequency decreases, the report said.

Houston Mayor Sylvester Turner said the report is further proof of the impact of pollution on “high-poverty communities of color.” (Reuters photo essay on pollution in Houston) https://reut.rs/3hqazdw

“The Texas Commission on Environmental Quality has the responsibility to take immediate action to strengthen existing rules to address the formaldehyde problem plaguing families near the Houston Ship Channel,” Turner said.

Formaldehyde or chemicals that combine to form it are released by refineries, chemical plants and automobiles.

Between September 2019 and September 2020, the Houston Health Department tested an area along the Houston Ship Channel that is home to several petrochemical plants and five crude oil refineries.

The highest concentrations of formaldehyde found “would translate to about one additional cancer case per 77,000 people, according to the Houston Health Department’s assessment of EPA’s cancer risk formulas,” the report said.

The report identified plants operated by Exxon Mobil, Chevron, Koch Industries’ Flint Hills Resources and NRG Energy as sources for formaldehyde or the chemicals that combine to form it.

“We are committed to operate in a manner that safeguards our environment and protects our people and community,” said Exxon spokeswoman Julie King. “Exxon Mobil has invested billions on environmental performance measures at our U.S. manufacturing sites over the past 20 years.”

(Reporting by Erwin Seba; Editing by David Gregorio and Lisa Shumaker)

Track The Brutal 2021 Wildfire Season With These Updating Charts And Maps


The West is a tinderbox this year, with heat waves and high winds through summer and fall expected to create the conditions for yet another brutal fire season.

 

“It’s just scary,” Alexandra Syphard, chief scientist with Vertus Wildfire Insurance Services and an ecologist at San Diego State University, told BuzzFeed News. “We’ve seen these severe fire seasons year after year now. Everybody’s nervous.”

The charts and maps below will update to track current wildfires and air quality, compare the 2021 season to previous years, and monitor the weather conditions that make fires more likely to ignite and spread quickly.

Latest fires

Latest fires, 50 acres burned or more. Peter Aldhous / BuzzFeed News / Via National Interagency Fire Center / Cal Fire

This table displays active fires that have so far burned 50 acres or more, recorded by the National Interagency Fire Center and the California Department of Forestry and Fire Protection, or Cal Fire. You can search for any of the fires in the table in the map below to zoom in on the fire and see the perimeter for the area burned, if that data is available.

Peter Aldhous / BuzzFeed News / Via National Interagency Fire Center / Cal Fire

Tap or hover over the fire icons to see the name of each fire and the area it has burned so far. The map also shows any large plumes of smoke visible from satellites, recorded by the National Oceanic and Atmospheric Administration’s Hazard Mapping System.

Air quality

Smoke plumes visible from orbiting satellites are often at high altitudes, so they may not affect air quality at ground level. But when wildfire smoke accumulates near the ground, it is hazardous for health.

Peter Aldhous / BuzzFeed News / Via AirNow

This map shows the latest “NowCast” Air Quality Index (AQI) readings from permanent monitors in the Environmental Protection Agency’s AirNow network. The monitors detect levels of tiny, hazardous particles called PM2.5, extrapolated over wider areas where sufficient data is available.

These airborne particles, which measure less than 2.5 micrometers across, are the main health concern from wildfire smoke because they penetrate deep into the lungs, enter the bloodstream, and can even affect unborn fetuses, lowering birth weight if pregnant people are exposed to smoke. PM2.5 can also trigger heart attacks, asthma, and other respiratory problems.

PM2.5 starts to affect vulnerable people, including young children and those with respiratory conditions, above an AQI of 100. Anything above 200 is considered “very unhealthy” for everyone, while an AQI of 300 or more is rated “hazardous” for all.
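Those cutoffs follow the EPA’s published AQI category bands (0–50 “Good” through 301 and above “Hazardous”). As a rough sketch of how a reading maps to its named band (the function name is ours, not AirNow’s):

```python
def aqi_category(aqi):
    """Map a NowCast AQI reading to the EPA's named category bands."""
    bands = [
        (50, "Good"),
        (100, "Moderate"),
        (150, "Unhealthy for Sensitive Groups"),
        (200, "Unhealthy"),
        (300, "Very Unhealthy"),
    ]
    for upper_bound, name in bands:
        if aqi <= upper_bound:
            return name
    return "Hazardous"  # 301 and above

print(aqi_category(120))  # Unhealthy for Sensitive Groups
```

So a wildfire-smoke reading of 120 already flags a risk for vulnerable groups, while anything past 300 is hazardous for everyone.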

Tap or hover over the circles to see the latest PM2.5 AQI for each monitor. You can also type a city into the search box to zoom in on that area.

Current drought conditions

Peter Aldhous / BuzzFeed News / Via US Drought Monitor

One reason experts are so concerned about the 2021 wildfire season is that the West is in the grip of a historic drought. An unusually dry winter left soils and vegetation parched, and mountain snowpacks, which feed the region’s rivers, were well below normal. This map shows the latest assessment from the US Drought Monitor, which is updated each Thursday.

Fire weather and risks

Peter Aldhous / BuzzFeed News / Via Wildland Fire Assessment System / National Weather Service

Even against the backdrop of widespread drought, the risk of fires igniting and spreading rapidly depends on local weather conditions. This map shows today’s outlook from the US Forest Service’s Wildland Fire Assessment System, which calculates risk categories from weather forecasts and observations.

You can use the control at top right to view areas currently under “red flag” fire warnings issued by the National Weather Service. These are declared when warm weather, low humidity, and strong winds are forecast to produce a high wildfire danger.

How the 2021 fire season compares to recent years

Chart compares the 2021 season to the previous 10 years up until the same date. Peter Aldhous / BuzzFeed News / Via National Interagency Fire Center

This chart compares the number of fires and total area burned so far in 2021 to the same date in each of the previous 10 years, according to data recorded by the National Interagency Fire Center.

But these numbers don’t tell the whole story. How hazardous wildfires are to people depends on when and where those fires happen. For example, large areas of the sparsely populated Mountain West or Plains states can burn without many homes, businesses, or people’s lives or health being threatened.

Last year showed how quickly circumstances can change. The 2020 season was running below the national average for the decade until mid-August, when “dry lightning” storms ignited a series of massive wildfires across Northern California and Oregon. The resulting disaster burned through entire towns in Oregon and created a pall of smoke that blocked the sun, casting large parts of the region in a sickly orange half-light in September and driving air quality into the hazardous range in some cities. By the end of the year, more than 10,000 buildings had been damaged or destroyed in California alone, and the total area burned nationally, at more than 10 million acres, was the second largest on record.

Climate change is making things worse

Total acres burned at the end of each year. Peter Aldhous / BuzzFeed News / Via National Interagency Fire Center

This chart shows the total acreage burned at the end of each year from 1983 — when federal agencies began tracking using the current reporting system — through to 2020. While the area burned varies widely from year to year, the overall trend is increasing.

Fire ecologists and climate scientists attribute this trend in large part to climate change, which is warming and drying California and other states across the West.

While media commentators sometimes describe recent severe fire seasons as the “new normal,” the truth is that ongoing climate change means things are likely to get worse for the foreseeable future. “We have not reached the peak,” Daniel Swain, a climate scientist at the University of California, Los Angeles, told BuzzFeed News last year. “In fact, no one knows where the peak is.”

Why the Northwest’s heat wave didn’t just break records, it obliterated them

Mashable – Climate Change

“It’s a staggering event.”
By Mark Kaufman, June 28, 2021

 

The heat wave in the Pacific Northwest shattered temperature records. Credit: Rapeepong Puttakumwong / Getty Images

When all-time heat records break, they usually break by a degree or so. Maybe two.

The heat wave in the Pacific Northwest and beyond, however, smashed Portland’s all-time record by a whopping nine degrees Fahrenheit, and in some places the extreme episode broke all-time records by 10 degrees. Canada, meanwhile, set its all-time national record by some 5 F in British Columbia, and may again break this record on Tuesday.

“It’s a staggering event,” explained Jeff Weber, a research meteorologist at the University Corporation for Atmospheric Research, an organization that facilitates and performs earth science.

What happened?

Meteorologists and atmospheric scientists already knew the heat would be oppressive and challenge all sorts of records. A potent combination of events came together: A hot weather pattern (called a heat dome) settled over the region, with temperatures also boosted some 2 degrees F (or perhaps much more) by the continuously warming Western climate (it’s significantly hotter than it was 100 years ago). What’s more, nearly 80 percent of the Pacific Northwest is in drought, and drought exacerbates heat.

But that’s not all. Another weather factor came into play and kicked things up a notch. Dry winds, traveling downslope from east to west, amplified the heat. Winds of this type are commonly known to Southern Californians as “Santa Ana winds,” but have different names in different places. Generally, the winds travel down from higher elevations (like mountains in eastern Oregon) and the sinking air compresses, creating even more heat in lower areas, like Portland. (This is also called “compressional heating.”)

Those hot winds amplified what was already an exceptional heat event, explained Weber. Even many cooler coastal areas weren’t spared.

“It’s the perfect storm,” said Weber.

The resulting temperatures are unparalleled in recorded history in the Pacific Northwest. Many people, and buildings, are ill-prepared for this kind of heat. Seattle, for example, is the least air-conditioned metro area in the U.S., according to The Seattle Times.

“It’s hitting an area where people don’t have AC,” noted Weber. “The discomfort level for the population is just overwhelming.” Indeed, illnesses from heat spiked in the Portland region during the extreme weather. Heat illness is serious: Among weather events in the U.S., extreme heat waves kill the most people.

Few heat waves are as anomalous as this Pacific Northwest event. As noted above, strong meteorological and climatic factors came together at June’s end. But overall, as the globe warms, new heat records now outpace new cold records: twice as many daily heat records are set as cold records.

Extreme events, hot or cold, happen normally. But climate scientists expect heat waves to grow more intense in a warmer world.

“Climate change is making extreme heat waves even more extreme and common,” Daniel Swain, a climate scientist at UCLA and the National Center for Atmospheric Research, told Mashable last week, before the record-breaking heat set in.

UPDATE: June 29, 2021, 9:46 a.m. EDT: This story was updated to reflect that Portland smashed its previous heat record of 112 F (set on Sunday) by reaching 116 F on Monday. The story also added that Canada smashed its all-time national heat record, and may do so again on Tuesday.

How Scientists Are So Confident They Know What’s Causing This Insane Weather

Nathan Howard/Getty

 

Dale Durran just endured a historic heatwave in Seattle, and perhaps more than most residents, he’s got good reason to be confident climate change had something to do with the regional madness that proved especially extreme next door, in Oregon, where dozens died.

The professor of atmospheric sciences at the University of Washington told The Daily Beast that this past week’s monstrous stretch—which topped out at a blistering 108 degrees on Monday—was “so outside the range of previous hot spells in Seattle that it really stretches the credibility of anyone suggesting it is simply natural variability.”

Durran isn’t someone who blames climate change every time he breaks a sweat. But he does think about this issue: of accurately attributing seemingly insane weather events, like an entire village being on fire in Canada this past week, to a heating planet.

Last year, in his jargon-heavy paper “Can the Issuance of Hazardous-Weather Warnings Inform the Attribution of Extreme Events to Climate Change?” Durran closely examined something called the “probability of detection” and the “false alarm ratio.”

“The point of this article,” he explained, “is that demanding scientific certainty in the face of an event such as our recent record-crushing heatwave in the Pacific Northwest before accepting the need to take action to stem global warming is as ridiculous as demanding 100 percent certainty before issuing a tornado forecast.”

He’s not alone in thinking the doubters have run out of room—that events have overtaken any shred of sane skepticism.

According to legendary Princeton geoscientist Michael Oppenheimer, scientists are no longer guessing when it comes to tying extreme events like this to climate change, because a whole new field now exists that aims to tie a nice neat bow around these very questions.

“There is now a well developed science of ‘event attribution’ which deals with uncertainty,” Oppenheimer told The Daily Beast. (His own research over the years has focused on what the specific hazards of climate change will be, not necessarily event attribution.)

Here’s Oppenheimer’s explanation of how event attribution scientists do their jobs: They use Fractional Attribution of Risk (FAR), which he said is “the fraction of the intensity of an event (like a heatwave) that can be attributed to human-made greenhouse gases.” For example, event attribution scientists calculated the FAR on 2017’s Hurricane Harvey after the fact, and found the storm was about twice as likely as it would have been without greenhouse gases at 2017 levels. That gave Harvey a FAR score of 0.5.
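The arithmetic behind that score is simple: if greenhouse gases doubled an event’s probability, half of its risk is attributable to them. A minimal sketch (the function name and the probability values here are illustrative, not taken from Oppenheimer’s work):

```python
def fractional_attribution_of_risk(p_natural, p_actual):
    """FAR = 1 - P(event without human influence) / P(event in the actual world)."""
    return 1 - p_natural / p_actual

# A doubling of probability (say, from 1% to 2% in a given year)
# yields Harvey's reported score:
far = fractional_attribution_of_risk(0.01, 0.02)  # 0.5
```

A FAR of 1.0 would mean the event was essentially impossible without human influence; a FAR of 0 means no detectable contribution.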

Got all that?

It doesn’t really matter right now, because, according to event attribution specialist Emily Williams—currently still a Ph.D. student in geography at UC Santa Barbara—there’s no FAR score for what just happened to the West Coast. “Until a formal attribution study is done on this current heatwave,” Williams, who co-authored her university’s fact sheet on heatwaves and climate change, told The Daily Beast, “we won’t be able to say specifically how much more intense or likely climate change made it.”

However, Williams said, it’s nonetheless a “pretty safe bet to venture out and say that climate change likely at least exacerbated” the situation.

That’s because what scientists call the overall “probability distribution” of heatwaves has now shifted. So even though they haven’t looked backward at this latest event (or series of events) and tied it to climate change, enough science has been done to safely say the reverse—they more or less predicted this heatwave would happen. What those past scientists have already demonstrated, according to Williams, is that we’re now twice as likely to experience record breaking temperatures, that 16 percent of North America is now exposed to extreme heatwaves, and that when heatwaves arrive, “they’re now hotter.”

Matthew Hurteau, a biology researcher at the University of New Mexico, studies the interaction between human behavior and the climate system, with an emphasis on fires, like the one that just consumed an entire Canadian town that had just set a national heat record of 121 degrees. Hurteau conducts his research by comparing reality with a model of an unchanged climate.

“When I look at the climate change fingerprint on fire in my own research, it typically involves running simulations with and without the climate changing,” he told The Daily Beast.

But he doesn’t always need a model to be certain in his own mind that climate change is to blame when he sees a fire. “When the Creek Fire was burning in the Sierra Nevada last fall and I was looking at the energy release data from satellites and how long into the year it was actively burning,” he said, “it was clear that there was a climate change amplification of that fire.”

Still, it’s worth noting that some scientists find this comparison-to-an-unchanged-world methodology unsatisfying. Not because they believe recent extreme events should be regarded as normal, but simply because we live in the aforementioned changed world, and the other one is fake.

Matthew Igel, an atmospheric scientist focused on clouds at the University of California Davis, told The Daily Beast that now that we’ve filled our atmosphere with greenhouse gases, “We will always lack a realization of Earth without climate change regardless of how excellent our models are or become.”

According to Igel’s explanation, you can think of each attribution study almost as a science-fiction story about someplace called “Earth 2,” where anthropogenic climate change didn’t occur—perhaps because humans don’t exist there. And it’s only by creating a model of Earth 2 that we can understand why it’s so hot here on Earth 1, a.k.a. the only Earth that actually exists. “Our statistical knowledge of truly extreme events from some baseline climate will always be poor,” Igel said. “And regardless, just because we have never observed something before, doesn’t mean that it was impossible, only that it didn’t happen.”

By no means did Igel dismiss the usefulness of attribution science—he’s just hesitant to call it conclusive. “These are the questions that keep me up at night,” he explained.

But what attribution scientists can offer about the connection between climate change and, for instance, your house burning down in a wildfire is good enough to be used in court. At least according to Michael Burger, a Columbia Law professor and executive director of the Sabin Center for Climate Change Law, which cooks up legal techniques to be used in the fight against climate change.

“There is nothing new about courts, or policymakers, making decisions in the face of probability calculations and varying degrees of scientific uncertainty. That’s the nature of the beast,” Burger explained. “As a lawyer, you have to deploy the science to make your case, and fit it to the relevant standard for the particular legal issue you are addressing.”

“Attribution Science has improved the precision of climate data with respect to delineation of atmospheric conditions with and without human caused greenhouse gases,” added Lindene Patton, a lawyer at the legal and advisory company Earth and Water Law Group.

When it comes to expert commentary before a judge, she said, “The state of the art attribution science is good enough—as good as morbidity data or demographic or other data we use.”

So the evidence, as in courtroom evidence, for climate change being the culprit in this heatwave isn't quite in yet. But that doesn't mean the case against the defendant, anthropogenic climate change, isn't looking extremely strong.

“Climate change caused by high levels of greenhouse gas changes all such events to some degree,” Oppenheimer said, “and I wait to see what the scientists who do these calculations say before deciding specifically how the character of an event was affected by the greenhouse gas buildup. That is precisely what I am waiting for now with regard to the recent heatwave in the Pacific Northwest.”

However, Oppenheimer added, “some events are so off the chart that you can say right off that there was very likely a big greenhouse gas contribution.” Hurricane Harvey, he said, was one of these instances where he didn’t need to wait for the evidence to be pretty sure. “There was no precedent even close to it in the local historical record. I think the same is true for the recent Pacific Northwest event.”

According to Williams, the UC Santa Barbara attribution specialist, events like this are “both a window into the future, and a reminder of why it’s so important that we take action now to transition to a just, low-carbon economy.”

Such a “just, low-carbon economy” is still a long way off, and those who stand on the sidelines poking holes in scientific conclusions have helped slow its creation. According to Durran, we need to stop letting that happen, even if attribution science gets it wrong from time to time.

“Some errors will always occur both when issuing weather warnings and when distinguishing natural variability from human-induced climate change,” he said. “In neither case can we let the possibility of error completely paralyze our response.”

A map shows the 12 states most at risk from COVID-19, all with high levels of Delta and below-average vaccination rates

[Photo: COVID-19 testing. Rick Pescovitz]

  • The 12 US states at most risk from COVID-19 include Arkansas, Nevada, and Missouri, according to Covid Act Now.
  • The organization uses CDC data and is partnered with Stanford, Harvard and Georgetown Universities.
  • The highest risk states include those with lower vaccination rates and more cases of the highly infectious Delta variant.

The US states most at risk from COVID-19 include those with below-average vaccination rates and high levels of the infectious Delta variant, according to data from an influential non-profit that’s partnered with Stanford, Harvard and Georgetown Universities.

Twelve states including Arkansas, Nevada, and Missouri are now at “high risk” from COVID-19, according to Covid Act Now’s US COVID risk and vaccine tracker, which mostly uses Centers for Disease Control and Prevention (CDC) data. There are just two states at low risk from COVID-19 – Massachusetts and Vermont.

Risk level according to US state by Covid Act Now
Risk level by US state – Covid Act Now covidactnow.org

 

The 12 states at high risk from COVID-19, according to Covid Act Now, are: Nevada, Utah, Missouri, Wyoming, Nebraska, Kansas, Oklahoma, Arkansas, Mississippi, Louisiana, Florida, and South Carolina.

Covid Act Now’s risk calculations are based on six factors that include infection rates, the percentage of people vaccinated, capacity at intensive care units, and socio-economic vulnerabilities that may impact recovery. There are five levels of risk: severe risk, very-high risk, high risk, medium risk and low risk.
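The article doesn't disclose Covid Act Now's actual weighting or thresholds, but the general shape of such a tracker is straightforward: score a handful of metrics and bucket the result into one of the five named risk levels. The sketch below is purely illustrative; the function name, the metrics chosen, and every threshold value are hypothetical, not Covid Act Now's real methodology.

```python
# Illustrative sketch only: Covid Act Now's real methodology and weights are
# not given in the article. This toy classifier shows how a tracker *might*
# bucket a state into one of the five risk levels using thresholds on a few
# of the six factors mentioned (all thresholds here are made up).

def risk_level(daily_new_cases_per_100k: float,
               percent_fully_vaccinated: float,
               icu_capacity_used: float) -> str:
    """Return one of the five levels used by Covid Act Now's tracker."""
    if daily_new_cases_per_100k >= 75 or icu_capacity_used >= 0.90:
        return "severe risk"
    if daily_new_cases_per_100k >= 25:
        return "very-high risk"
    if daily_new_cases_per_100k >= 10 or percent_fully_vaccinated < 40:
        return "high risk"
    if daily_new_cases_per_100k >= 1:
        return "medium risk"
    return "low risk"

# A state with moderate case growth and a 34% full-vaccination rate would
# land in the "high risk" bucket under these hypothetical thresholds:
print(risk_level(12.0, 34.0, 0.60))  # prints "high risk"
```

In practice a real tracker blends all six factors (including socio-economic vulnerability) rather than applying simple cutoffs, but the bucketing logic is the same idea.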

Professor Eric Topol, director of the Scripps Research Translational Institute, said on Twitter on Friday that states had moved up risk categories as the Delta variant continued to spread, citing Covid Act Now. He said Delta accounted for at least 35% of new cases in these high-risk states.

Thirty-six states are at medium risk, while two – Massachusetts and Vermont – are low risk, according to Covid Act Now’s data. Both of these states have low levels of the Delta variant, according to Scripps Research’s Outbreak.info.

More than 80% of people are fully vaccinated in both states, according to data from Johns Hopkins University – well above the nation’s 47% average.

By comparison, the Delta variant accounts for more than 80% of new cases in Arkansas, Nevada, and Missouri, according to Outbreak.info. The number of people fully vaccinated is 34% in Arkansas, 42% in Nevada, and 39% in Missouri.

Actual figures may vary because there can be delays in uploading data, or it may not be available. Insider’s Aria Bendix reported on Friday that the CDC stopped monitoring non-severe COVID-19 cases among vaccinated people in May. The number of tests sequenced also differs. For example, the data from Massachusetts is based on more than 18,000 sequenced tests, and Arkansas’ data comes from just over 960 sequenced tests, according to Outbreak.info.

It’s not clear how the level of risk will translate to new infections.

Topol said it was promising that the number of new daily cases was still low. “But rising,” he added.

The hardest question about the Florida condo collapse: Is it worth rebuilding in a city that could be underwater in 30 years?

[Photo: Two luxury condominium buildings under construction on Fisher Island, south of Surfside and Miami Beach, where developers continue to build despite the growing risks posed by climate change. Jeffrey Greenberg/Getty Images]
  • The cause of the Florida condo collapse is still unknown, but climate change is among early theories.
  • Experts say rising sea levels will pose major risks for other coastal residents in the near future.
  • Yet Miami real estate prices are soaring, even as some experts warn against new development.

A week after the Champlain Towers South condo building in Surfside, near Miami, Florida, collapsed, 22 people are dead and more than 100 are still missing.

While speculation is already swirling about what caused the collapse, with observers blaming everything from inaction by the condo board to lax building regulations to rising sea levels, investigators are likely still months from a definitive answer.

One thing is certain, however: Climate change is already threatening to leave substantial parts of coastal areas like Miami underwater in the coming decades, meaning more buildings and infrastructure could be wiped out.

Despite the ominous signs, Miami real estate prices continue to soar and new development projects move forward, in what some experts say is a detachment from the environmental – and economic – reality.

In Florida alone, $26.3 billion worth of coastal property, housing more than 90,000 people, is at risk of becoming “chronically inundated” – that is, flooding at least 26 times per year – by 2045, according to Insider’s analysis of a 2018 report by the Union of Concerned Scientists.

By those estimates, homebuyers taking out a 30-year mortgage today would likely see their homes flooding every two weeks by the time their loan term expires.

“Florida is ground zero for sea level rise in the United States,” Kristy Dahl, a senior climate scientist at UCS, told Insider.

That rise is causing more of the state to experience flooding, not just during so-called “king tides,” but also during normal high tides, Dahl said, adding that “seawater that’s flooding communities is incredibly corrosive.”

“Regular high-tide flooding will affect all kinds of infrastructure in the coming decades,” she said, pointing out a UCS study that showed how flooding could derail Amtrak’s Northeast Corridor route by 2050.

“A way to drive our economy”

After Hurricane Andrew devastated the state in 1992, Florida passed a wave of new building codes to mitigate future storm damage. The Palm Beach Post reported Friday that the collapse in Surfside could similarly push lawmakers to abandon the state’s historically hands-off approach to regulation in favor of more stringent rules for aging condo buildings.

Following decades of denialism, more Florida Republicans have also begun to acknowledge the reality of climate change and the risks it poses for their coastal communities, paving the way for more aggressive, bipartisan efforts.

Florida’s state legislature recently authorized $640 million for climate resiliency initiatives, while the mayors of Miami, Miami Beach, and Miami-Dade County have rolled out a strategic plan outlining steps to prepare the region.

Some developers are also beginning to see a business case for investing in climate resilience.

“We need to understand about how much it’s going to cost, but ultimately… we found that the return on investment is significant and it will create thousands of jobs,” Alec Bogdanoff, CEO of Brizaga, a Florida-based civil and coastal engineering firm, told Insider.

“We’re not only investing in adaptation and resilience because we have to, but it’s actually a way to drive our economy and grow our economy,” he said.

But some experts worry that trying to adapt to the climate – through evolving construction techniques, pump systems, and raised buildings and sidewalks, for example – may still not be enough to save cities like Miami.

“Why the heck are we letting people build?”

“We know seawater is going to arrive,” Harold Wanless, a professor and chair of the department of geological science at the University of Miami, told Insider. “What we should be doing is saying: ‘Why the heck are we letting people build in an area that’s going to be flooded by rising sea levels?’”

Wanless said that a 2-3 foot rise in sea level, which estimates predict could happen in Miami by 2060, would also cause 100 to 200 feet of beach erosion, a rate that would make it too expensive to combat by simply adding more sand.

“At that point, you don’t fight it, and we should be realizing that’s where we’re headed,” Wanless said.

But many still don’t, partly because various financial incentives keep pushing developers to build in high-risk areas, including developers’ outsize influence over local politics and wealthy buyers’ ability to withstand losses, according to a report last year in Yale’s Environment360.

That report argues that the “narrow path for survival” for Florida’s coastal counties involves, among other strategies, “orderly retreats from most vulnerable coastal neighborhoods.”

But withdrawing from coastal properties, despite the science, would run up against another obstacle, according to Dahl: human nature.

“We’re still drawn to the water just as we always have been, and I think that’s going to be a really difficult cultural shift to make,” she said, especially given the lack of disclosure about climate risks in real estate listings.

In 2019, journalist Sarah Miller pretended to be interested in buying a luxury home in Miami Beach so she could ask realtors about climate-related risks, detailing the “cognitive dissonance” she witnessed in an article for Popula.

In response to a friend’s skepticism about whether cities could become climate-proof through resilience alone, Miller wrote: “This is the neoliberal notion, that the reasonable and mature way to think about this stuff is: Get more efficient and find the right incentives to encourage the right kinds of enterprise. But my friend wondered, what if the mature thing to do is to mourn – and then retreat?”