Workers at Chernobyl site say Russian soldiers drove through the highly-radioactive ‘Red Forest’ with no protective gear

Business Insider

Workers at Chernobyl site say Russian soldiers drove through the highly-radioactive ‘Red Forest’ with no protective gear

Marianne Guenot – March 31, 2022

A sign warns of radiation at the “red forest” near the former Chernobyl nuclear power plant on September 29, 2015.Sean Gallup/Getty Images
  • Russian soldiers drove armored vehicles through the ‘Red Forest’ as they invaded Chernobyl, per Reuters.
  • Reuters spoke to two unnamed Ukrainian staff at the decommissioned Chernobyl nuclear power plant.
  • The soldiers didn’t wear protective gear and exposed themselves to radioactive dust, the workers said.

Russian soldiers drove armored vehicles through the most contaminated area of Chernobyl’s exclusion zone without protection, two sources told Reuters.

The trip kicked up clouds of radioactive dust that could be damaging to the troops’ health, the outlet said.

One source told the outlet that the soldiers appeared to have no idea about Chernobyl’s history as a nuclear disaster site.

Reuters spoke to two Ukrainian workers who were on duty in the decommissioned power plant when Russian soldiers invaded the site on February 25 in the first days of the invasion of Ukraine.

The workers, who asked not to be named out of fear for their safety, spoke to Reuters in late March.

By that time they had just been allowed home after being confined to the decommissioned power plant by Russian soldiers for almost a month.

The sign marks the territory of the Red Forest, Kyiv Region, northern Ukraine on April 21, 2021.Volodymyr Tarasov/ Ukrinform/Future Publishing via Getty Images

The workers said that Russian soldiers drove armored vehicles through the Red Forest, 1.5 square miles of pine forest that died from radiation exposure shortly after the 1986 Chernobyl nuclear accident. The Red Forest remains the most contaminated area of the zone around Chernobyl, per Reuters.

“The convoy kicked up a big column of dust,” one source said, per Reuters.

One worker told Reuters that some of the soldiers he had spoken to in the plant “did not have a clue” about the 1986 catastrophe at Chernobyl, the world’s worst nuclear disaster.

“They had no idea what kind of facility they were at,” the worker said.

The workers’ accounts are in line with reports from the International Atomic Energy Agency, which said shortly after Russian troops moved into the exclusion zone that remote sensors showed the invasion had disturbed radioactive dust, raising the level of radiation around the site.

The plant has been under the control of Russian troops since February 24. The power plant was fully decommissioned after the 1986 nuclear accident and the remaining work at the site is mostly directed toward decontamination.

An active nuclear power plant in Zaporizhzhia in the south of Ukraine has also been captured and has been operating under the control of Russian troops since March 4.

The Russian military’s seizure of nuclear sites caused outrage among nuclear experts.

Several told Insider that the risk of a catastrophic nuclear accident at either site remains low, but that the nearby fighting has made one more likely.

The occupation of the Chernobyl site might soon be coming to an end, according to an unnamed US defense official.

The official told France 24 on Wednesday that the Russian troops appeared to be withdrawing from Chernobyl and repositioning in Belarus, whose border with Ukraine lies around 10 miles from the Chernobyl site.

Norway told to get Cold War bunkers ready amid fears of nuclear fallout

Yahoo! News

Norway told to get Cold War bunkers ready amid fears of nuclear fallout

Andy Wells, Freelance Writer – March 31, 2022

A radiation warning at Pripyat, Ukraine, a ghost city that was evacuated the day after the Chernobyl disaster. (Getty)

Fears of another Chernobyl-like disaster in Ukraine have prompted warnings in Norway for citizens to “dust off” Cold War bunkers.

Since invading Ukraine last month, Russian troops have occupied Europe’s largest nuclear power plant, at Zaporizhzhia, as well as the now-defunct plant at Chernobyl, the scene of the world’s worst nuclear accident in 1986.

Odd Roger Enoksen, Norway’s defence minister, has now aired concerns that any accident at a Ukrainian power plant that causes a radiation leak could affect his own country if the wind blows in its direction.

According to The Times, defence sources have told civilians in Norway to start “dusting off” their bunkers at home “in case of nuclear alert”.

A view shows the abandoned city of Pripyat near the Chernobyl Nuclear Power Plant, Ukraine. (Reuters)
Service members take part in tactical exercises in the abandoned city of Pripyat near the Chernobyl Nuclear Power Plant, Ukraine. (Reuters)

The source told the paper: “Everyone has been warned so if they are using them for storage now they need to make a plan for taking things out.”

A 72-hour warning would be given in advance to get bunkers ready for use.

Enoksen sought to calm fears, saying the warning was prompted by the risk of fallout from an accident rather than the threat of nuclear war.

He said: “Ukraine has the most production of nuclear power in Europe and if an accident happens, as with Chernobyl, we will all in western Europe be affected by that if the wind goes in this direction.”

He said Norway was still able to see the effects of Chernobyl, adding: “In summer time we can actually see ashes from burning grass in Ukraine.”

The warning comes as US defense officials said Russian forces may have begun to pull out of Chernobyl.

According to the AFP news agency, US defense officials said troops had begun walking away from Chernobyl and moving to Belarus. “We think that they are leaving. I can’t tell you that they’re all gone,” they said.

Norway, which shares a 12-mile land border with Russia, made it compulsory during the Cold War era for bunkers to be built in civilian infrastructure such as hotels.

Norwegians have also been told to stock up on medicines for children in case of radioactive fallout.

Norway’s defence minister Odd Roger Enoksen said Norway was still able to see the effects of the Chernobyl disaster. (Reuters)

The warnings come as the head of Ukraine’s state nuclear company urged the UN nuclear watchdog to help ensure Russian nuclear officials do not interfere in the operation of nuclear power plants occupied by Russian forces.

Energoatom CEO Petro Kotin said earlier this month that Russia’s state nuclear company Rosatom had sent officials to the Zaporizhzhia plant to try to take control of it.

Representatives from the International Atomic Energy Agency (IAEA) arrived in Ukraine on Tuesday to inspect the country’s nuclear facilities after Kyiv claimed that munitions stored near Chernobyl could explode.

A New Safe Confinement (NSC) structure over the old sarcophagus covering the damaged fourth reactor at the Chernobyl Nuclear Power Plant, Ukraine. (Reuters)

IAEA chief Rafael Grossi is on hand to assist, along with experts and equipment aimed at keeping the country’s nuclear facilities safe.

Since Russia’s invasion, Grossi has called on both countries to urgently agree a framework to ensure nuclear facilities are safe and secure.

Ukraine has repeatedly expressed safety concerns about Chernobyl and demanded Russian forces occupying the plant pull out of the area.

The Russian military said after capturing the plant that radiation was within normal levels and their actions prevented possible “nuclear provocations” by Ukrainian nationalists.

Russia has denied that its forces have put nuclear facilities inside Ukraine at risk.

Scientists Achieve Record Energy Efficiency for Thin Solar Panels

EcoWatch – Renewable Energy

Scientists Achieve Record Energy Efficiency for Thin Solar Panels

Paige Bennett – March 30, 2022


Scientists collaborated with AMOLF in Amsterdam to use solar panels one micrometer thick with a disordered honeycomb layer on top of the silicon panel. AMOLF

Scientists from the University of Surrey and Imperial College London have achieved a 25% increase in energy absorption in ultra-thin solar panels, a record for panels of this size.

The team, which collaborated with AMOLF in Amsterdam, used solar panels just one micrometer thick with a disordered honeycomb layer on top of the silicon panel. The biophilic design draws inspiration from butterfly wings and bird eyes to absorb sunlight from every possible angle, making the panels more efficient.

The research led to a 25% increase in levels of energy absorption by the panels, making these solar panels more efficient than other one-micrometer-thick panels. They published their findings in the American Chemical Society’s journal, Photonics.

“One of the challenges of working with silicon is that nearly a third of light bounces straight off it without being absorbed and the energy harnessed,” said Marian Florescu from the University of Surrey’s Advanced Technology Institute (ATI) in a statement. “A textured layer across the silicon helps tackle this and our disordered, yet hyperuniform, honeycomb design is particularly successful.”

The panels in the study reached absorption levels of 26.3 mA/cm2, compared to a previous absorption record of 19.72 mA/cm2 from 2017.

Increasing the efficiency and absorption of ultra-thin panels is crucial to achieving low-cost photovoltaics.

“Micrometer-thick silicon photovoltaics (PV) promises to be the ultimate cost-effective, reliable, and environmentally friendly solution to harness solar power in urban areas and space, as it combines the low cost and maturity of crystalline silicon (c-Si) manufacturing with the low weight and mechanical flexibility of thin films,” the authors of the study explained.

The researchers expect that further design improvements will push the efficiency of micrometer-thin panels even higher, allowing them to compete with existing commercial solar panels. Plus, these flexible panels could offer versatility in how they are used.

“There’s enormous potential for using ultra-thin photovoltaics. For example, given how light they are, they will be particularly useful in space and could make new extra-terrestrial projects viable,” Florescu said. “Since they use so much less silicon, we are hoping there will be cost savings here on Earth as well, plus there could be potential to bring more benefits from the Internet of Things and to create zero-energy buildings powered locally.” 

Outside of photovoltaics, the research could also be useful for other industries, like photo-electrochemistry, solid-state light emission and photodetectors, that focus on light management.

Following the successful absorption rate increase of the ultra-thin panels in this study, the scientists plan to start looking for commercial partners and develop a plan for manufacturing.

50% of U.S. Lakes and Rivers Are Too Polluted for Swimming, Fishing, Drinking

EcoWatch – Health – Wellness

50% of U.S. Lakes and Rivers Are Too Polluted for Swimming, Fishing, Drinking

Olivia Rosane – March 29, 2022

A steel mill on Indiana’s Grand Calumet River. Cavan Images / Getty Images

Fifty years ago, the U.S. passed the Clean Water Act with the goal of ensuring  “fishable, swimmable” water across the U.S. by 1983. 

Now, a new report from the Environmental Integrity Project (EIP) finds the country has fallen far short of that goal. In fact, about half of the nation’s lakes and rivers are too polluted for swimming, fishing or drinking. 

“The Clean Water Act should be celebrated on its 50th birthday for making America’s waterways significantly cleaner,” EIP Executive Director Eric Schaeffer said in a press release announcing the report.  “However, we need more funding, stronger enforcement, and better control of farm runoff to clean up waters that are still polluted after half a century. Let’s give EPA and states the tools they need to finish the job – we owe that much to our children and to future generations.”

The report was based on assessments that states are required to submit under the Clean Water Act on the pollution levels of their rivers, streams, lakes and estuaries. According to the most recent submissions, more than half of the nation’s lakes and rivers are considered “impaired,” meaning that they fall short of standards for fishing, swimming, aquatic life and drinking. 

Specifically, around 51 percent of rivers and streams and 55 percent of lake acres are considered impaired, The Hill reported. Further, 26 percent of estuary miles are also impaired. 

The Clean Water Act was a landmark legislative achievement when it was passed in 1972. It promised to end the discharge of all pollutants into navigable waters by 1985, according to the press release. However, it has fallen short of that goal for several reasons, according to the report. 

  1. The act has strong controls for pollution pumped directly into waterways from factories or sewage plants but not for indirect pollution such as agricultural runoff from factory farms.
  2. The Environmental Protection Agency (EPA) has dragged its feet in updating industry-specific technology-based limits for water pollution control systems. By 2022, two-thirds of these industry-specific limits had not been updated in more than 30 years.
  3. Budget cuts have hampered the ability of the EPA and state agencies to enforce the law.
  4. Permit requirements are poorly enforced.
  5. Total Maximum Daily Loads, a kind of pollution control plan, are insufficient. 
  6. There are problems effectively managing watersheds that cover two or more states. 

The report also broke down pollution by state. Indiana has the most miles of rivers and streams too impaired for swimming and recreation.

“Indiana’s waters have benefited from the Clean Water Act, but unfortunately, they also illustrate some of the gaps in the law,” Dr. Indra Frank, Environmental Health & Water Policy Director for the Hoosier Environmental Council, said in the press release. “We have seen persistent, unresolved impairments, especially for E coli bacteria in our rivers and streams, in part from industrial agricultural runoff.  And we have also seen examples of Clean Water Act permits used to send water contaminated with coal-ash into our rivers. We need to halt pollution like this.”

Florida, meanwhile, had the most lake acres impaired for swimming and aquatic life. 

“Florida’s toxic-algae crisis is the direct result of lax enforcement of phosphorus and nitrogen pollution limits in cleanup plans required by the Clean Water Act,” Friends of the Everglades Executive  Director Eve Samples said in the press release. “Because these limits rely on voluntary ‘best management practices’ and a presumption of compliance, agricultural polluters regularly exceed phosphorus runoff limits while dodging responsibility — leading to harmful algal blooms in Florida’s lakes, rivers, estuaries, and even on saltwater beaches.”

The report did propose several solutions that range from making sure that the EPA and other agencies carry out their duties under the existing law to strengthening the act with new legislation to control runoff pollution. 

This last point is particularly important because agricultural runoff and other indirect pollution sources are the leading causes of waterway pollution. 

“Factory-style animal production has become an industry with a massive waste disposal problem and should be regulated like other large industries,” the study authors wrote.   

Microplastics have been found in air, water, food and now … human blood

USA Today

Microplastics have been found in air, water, food and now … human blood

Mike Snider – March 25, 2022

Powerful magnification allowed researchers to count and identify microplastic beads and fragments that were collected in 11 western national parks and wilderness areas over 14 months of sampling in a 2020 study.

Plastic – it’s in your blood. And we know so because researchers have just found microscopic plastic particles flowing in our bloodstream for the first time.

Previous research had found we inhale and ingest enough microscopic pieces of plastic to create a credit card each week. But until now, scientists didn’t know whether those particles were entering the bloodstream.

“It’s the first step for proper risk assessment … (of) the internal concentrations of plastic particles,” Dick Vethaak, professor of ecotoxicology, water quality and health at Vrije Universiteit Amsterdam in the Netherlands, told USA TODAY. Vethaak is among the authors of a study published Thursday in the peer-reviewed journal Environment International.

Plastic particles were found in the blood of more than three-fourths (17 out of 22) of the Netherlands-based donors who participated in the study. Of course, knowing there is plastic in the blood of many people just leads to more questions for researchers to tackle.

“We have to find out where are these particles traveling. Do they accumulate in certain organs?” Vethaak said. “Are (accumulations) sufficiently high enough to trigger responses leading to diseases?”

Plastic particles can enter the body through your food and drink, the air you breathe – there are microscopic plastic bits flying around in the air – and even from the rain.

Finding signs of plastic in the blood

Researchers analyzed subjects’ blood samples for traces of the presence of different polymers, which are the building blocks of plastics. Most prominent was polyethylene terephthalate (PET), a common type of plastic used in making drink bottles, food packaging and fabrics, and even lip gloss.

The second most commonly found plastic in the samples: polystyrene, used to make a wide variety of common household products including disposable bowls, plates and food containers, and what we call styrofoam.

The third most common plastic found in subjects’ blood was polyethylene, a material regularly used in the production of paints, sandwich bags, shopping bags, plastic wrap and detergent bottles, and in toothpaste.

Polypropylene, used in making food containers and rugs, was also found in subjects’ blood, but at concentrations too low for an accurate measurement.

Did you know?

  • Humans have produced 18.2 trillion pounds of plastics – the equivalent of 1 billion elephants – since large-scale plastic production began in the early 1950s. Nearly 80% of that plastic is now in landfills, researchers say. By 2050, another 26.5 trillion pounds will be produced worldwide.
  • Plastic flowing into the world’s oceans, rivers and lakes will increase from 11 million metric tons in 2016 to 29 million metric tons annually in 2040, the equivalent of dumping 70 pounds of plastic waste along every foot of the world’s coastline, according to research from The Pew Charitable Trusts.
  • You eat or breathe in about 2,000 tiny plastic particles each week, the World Wildlife Fund found in a 2019 study. Most are ingested from bottled water and tap water.

The overall concentration of plastic particles in the donors’ blood averaged 1.6 micrograms per milliliter (a microgram is one-millionth of a gram) – the equivalent of one teaspoon of plastic in the amount of water in ten large bathtubs, researchers say.

That’s not much, but researchers only searched for a few plastic polymers. And plastic particles may be in different concentrations in different parts of the body.

Researchers particularly wonder whether microplastics – or even smaller particles called nanoplastics – could affect the brain, digestive system and other parts of the body. Could they help cancers develop or grow?

“More detailed research … is urgently needed,” Vethaak and other researchers say in a separate article published this week in the peer-reviewed journal Exposure and Health. “The problem is becoming more urgent with each day.”

Microplastics: A problem that’s not going away

That’s because microplastics, a type of pollution, are literally everywhere, having been found from the bottom of the ocean to Mount Everest. We’ve known that fish have been ingesting them.

More foods including fruits and vegetables may contain microplastics, too. Previous research found that infants may ingest 10 times the amount of microplastics that adults do, based on a 2021 study comparing adult and infant feces. Babies could have higher microplastics exposures from bottles and baby toys, researchers suggest.

Microplastics will continue to spread because plastic production is only increasing, said Jo Royle, CEO of Common Seas, an organization targeting plastic pollution in the oceans. Common Seas, along with the Netherlands Organisation for Health Research and Development, commissioned the research. “We need to hurry up and invest in the research to be able to understand what threats plastics pose to human health,” Royle told USA TODAY.

She said her blood and Vethaak’s were also analyzed and found to contain plastic particles, though those results were not included in the published research. “To find this plastic in my blood, it is concerning,” Royle said.

With research, “we can make informed choices,” she said. “But there’s a lot of steps that we can take each day to reduce our exposure to single-use plastics and particularly food and beverage packaging.”

What climate change will mean for your home

The Washington Post

What climate change will mean for your home

Michele Lerner – March 24, 2022

When Miyuki Hino bought a house in Chapel Hill, N.C., in 2020, she checked an online map that showed the damage caused by Hurricane Matthew in 2016 to evaluate the neighborhood.

“We wanted to know our flood risk before buying, although we’re aware that every storm is different and they can be hard to predict,” says Hino, an assistant professor of city and regional planning at the University of North Carolina at Chapel Hill. “We had to make an offer quickly, so we looked at the map and we asked neighbors about which houses nearby had flooded. We found out that our street is on a slight hill and the homes at the bottom of the hill had more trouble from that hurricane.”

Hino purchased flood insurance, which costs about $300 annually, even though it isn’t required for her home.

“Our first concern is for the safety of everyone in the house,” says Hino. “Our second concern is about property damage in case of a storm. But we’re also concerned about the long-term impact of extreme events on the value of our property.”


Not every buyer is as diligent about evaluating the potential risk of a weather-related disaster, but that may change in the future. Violent storms, wildfires, floods, droughts and extreme heat are among the increasingly visible signs of climate change. While safety issues associated with these events are of prime importance, the frequency and intensity of dramatic natural disasters are beginning to have an impact on property values and the cost of homeownership in some locations. Researchers are analyzing data to help buyers, homeowners, lenders, insurance companies and appraisers evaluate what the future may hold and how that could impact the housing market.

“Most homeowners should care about climate change and the potential impact on their families and property,” says John Berkowitz, CEO and founder of OJO Labs, a real estate technology firm that owns the Movoto listing site in Austin. “Unfortunately, the people who are most likely to be hurt are already disadvantaged in the housing market, such as first-time buyers and minority buyers who are focused on affordability now. They don’t have the luxury of time or money to think about what their property value will be in 2050.”

Lack of knowledge about climate risk makes it difficult for buyers to recognize that their home could be more costly to maintain, more expensive to insure, and more exposed to damage and possible destruction from a storm or fire. All those possibilities could also contribute to a decline in a property’s value or the inability to sell the home in the future. Yet few consumers consider these issues when buying a home.

Fires, floods and home values

Numerous studies have recently looked at the current impact of hazards on property values. For example, Redfin researchers found that homes in areas prone to wildfires sold for an average of 3.9 percent less compared with homes in areas with lower wildfire risk in California, Oregon and Washington state in 2020. Between 2012 and 2020, the median sales price of homes in low-risk areas increased 101 percent compared with an 88 percent increase in the median sales price for homes in areas with a high risk for wildfire, according to the study.

But home values don’t always correlate with climate risks. Hino co-wrote a report with Marshall Burke, an associate professor in the department of Earth system science at Stanford University, titled “The Effect of Information About Climate Risk on Property Values,” that focused on flood risk.

“Our research looked at the impact of regulatory flood plain maps, which are used to determine whether a home needs flood insurance, on home prices,” says Hino. “We expected to see that homes that require flood insurance would be less costly than similar homes that don’t require flood insurance, but that’s not happening.”

The main culprit is lack of information, says Hino.

“I read one study that found that less than 10 percent of buyers know that a house is in a flood plain before they make an offer,” says Hino. “They find out later when their lender checks the [Federal Emergency Management Agency] map to see if flood insurance is required.”

Homes in coastal areas that are prone to flooding are desirable to many buyers for their water views, which keeps their prices high. A 2021 study by Redfin researchers found that homes with a high risk of flooding sold for a 13.6 percent premium over homes with a low risk of flooding during the first quarter of 2021, a larger premium than in either 2020 or 2019.

Unfortunately, FEMA maps have been found to underestimate flood risk. A study by the nonprofit First Street Foundation found that more than 23.5 million properties are at risk of flooding over the next 30 years. First Street Foundation’s Flood Factor tool, which is available to consumers, includes flood risk from urban storm water flooding, storm surge and future conditions such as rising seas.

Mortgage lenders and insurance companies rely on FEMA maps to evaluate flood risk and to inform consumers about the requirement or recommendation for flood insurance. Flood damage is not covered by regular homeowners insurance policies and therefore requires a separate policy. The Research Institute for Housing America (RIHA) at the Mortgage Bankers Association released a study earlier this year – “The Impact of Climate Change on Housing and Housing Finance” – that concluded that the housing industry lacks an accepted indicator to assess climate risk.

“There’s lots of work to do in the industry because there’s no single test for climate projections that lenders can use for risk management,” says Eddie Seiler, executive director of RIHA in D.C. “There are private companies working to build models to understand the risks to homeowners and the financial risks to lenders. Freddie Mac and Fannie Mae are working to come up with climate scenarios, too.”

Seiler says he believes that eventually climate risk may become part of the mortgage underwriting process. The report found that, in addition to increased flood risk and property damage, climate change may increase mortgage default rates, increase the volatility of house prices and possibly produce climate-related migration patterns. If people choose to move away from areas with high risks from fires, floods and storms, that could reduce property values in those communities.

“After Hurricane Katrina, the mortgage industry didn’t know whether borrowers would default on their loans,” says Seiler. “The FEMA maps were way out of date, so people who were at high risk for floods didn’t know it and didn’t have flood insurance. In that case, the federal government stepped in. But we know that when people are underwater on their loans, they default more often.”

Another risk is that if insurance rates skyrocket, the cost of having a home would be so high that owners would be unable to repay their loans, Seiler says.

“Insurance companies raise rates as much as 20 or 30 percent in high-risk areas compared to low-risk areas,” says Brian O’Connell, a senior insurance analyst at InsuranceQuotes.com in Bucks County, Pa. “Buyers should expect to see rates increase as we see more floods, fires and heat waves. Alternatively, some insurance companies may simply get out of the business, which could also increase costs because of the lack of competition for customers.”

Some insurance companies also raise the deductible for specific events such as hurricanes, which leaves homeowners responsible for thousands of dollars of repair costs, according to O’Connell.

Consumers and climate risk

The unpredictability of climate change makes it difficult to evaluate the risk for a specific event to occur at any particular property. Even wildfires sometimes skip over some homes. Hurricanes and tornadoes have uneven impacts on homes within the same neighborhood.

Another obstacle for home buyers is that seller disclosure rules vary by jurisdiction. Sellers are not always required to share information about risks associated with natural disasters or previous damage.

“We found that in states with stricter disclosure laws there was a higher correlation between pricing and flood insurance,” says Hino. “In states such as Louisiana, Texas, Oklahoma and South Carolina, home prices are lower on homes that carry a risk of floods because buyers are aware of the risk.”

One solution is to provide data about possible future increases in storms and extreme heat directly to buyers and to real estate agents who can share that information with house hunters, says Berkowitz. Movoto includes climate-risk information from ClimateCheck for each listing on its site.

“Consumers can look now at listings on sites such as Redfin and Realtor.com for flood risk scores and climate scores,” says Seiler. “That helps to get people thinking earlier about the potential risk from floods, fires and storms.”

Consumers can also go directly to sites such as ClimateCheck, Flood Factor, Attom Data Solutions Home Disclosure Report and CoreLogic’s RiskMeter to review hazard risks that include storms, floods and wildfires.

“We’re working with climate scientists to develop analytics on what climate change means, such as whether there will be more hurricanes or stronger hurricanes and whether the issue will be storm surges or high winds,” says Tom Larsen, principal for insurance and spatial solutions at CoreLogic, a data analytics firm based in Irvine, Calif. “The challenge with these perils is that you don’t see identical damage to each house. So we use our spatial modeling to look on a granular level at every house. We can look at the elevation above the sea level of the first floor of a house and follow wildfire patterns property by property.”

Since CoreLogic primarily provides analytics to industry professionals such as insurance companies and lenders, its focus is on what it would cost to repair or rebuild a property. Mortgage and insurance companies need the information because of their financial commitment to the property.

“Consumers want to know if their home will lose value, but it’s tough to evaluate the market price of a property versus the physical cost of rebuilding,” says Larsen. “But consumers also need to know their total cost to live in a home. Eventually, I think predicting insurance costs based on climate risk will become part of the mortgage process because it’s part of the cost of ownership.”

For buyers today, assessing the potential cost from climate risk is one more thing to pay attention to and is challenging to evaluate, says Larsen.

“Eventually, we’ll get to the point where people can see an average score that demonstrates what the risk is now, the expected cost of possible damages and a prediction of future potential costs,” says Larsen. “That’s not necessarily to tell someone not to buy someplace, but to help them understand the risk they’re accepting by buying in certain locations.”

O’Connell recommends hiring a good buyers’ agent who will warn consumers about high insurance costs or elevated risk for natural disasters.

“Buyers should do their due diligence and check insurance premiums ahead of time for different areas, so they understand what they’re getting into if they choose to buy near water, for example,” says O’Connell. “They should also read their insurance policy, so they know what happens if there’s a weather event and to make sure they’re covered for a wildfire or wind damage. If they’re not comfortable reading it, they should ask a lawyer to review it or talk to an insurance expert.”

Buyers may want to factor in costs related to adapting their homes for climate change, says Berkowitz.

“For example, homeowners in places that are beginning to see more severe winters need to consider the cost of winterizing their homes with more insulation and better windows,” Berkowitz says. “Homeowners in traditionally cooler climates like Seattle are finding themselves investing in air conditioning now that the summers are hotter there.”

Climate risk has received little attention from buyers so far, but that won’t last forever, especially as the risk increases, Berkowitz asserts.

However, Berkowitz acknowledges, it’s hard to predict whether climate change will decrease the desirability of homes in some areas because of safety issues or because of the higher cost of ownership. It could just mean that homes in some areas appreciate less over the next 30 years than they did over the previous 30 years.

“Home buyers and owners need to recognize the value of their house today and understand how it could change in the future,” says Berkowitz. “They need to be aware of the full cost of ownership, including maintenance and insurance and how those costs could rise.”

How to evaluate climate risk when house hunting

Check all listings on sites such as Realtor.com, Movoto and Redfin for information about climate-related risks such as floods and fires.

Ask neighbors about recent storms and damage.

Ask your real estate agent for information about floods, fires and storms in the area.

Check the address of a property on sites such as ClimateCheck, Flood Factor, Attom Data Solutions Home Disclosure Report and CoreLogic’s RiskMeter.

Depending on the local disclosure laws, ask the seller and listing agent for information about previous flood or fire damage.

Request a homeowners insurance estimate as early as possible to determine affordability.

Ask a home inspector to look for evidence of previous storm or fire damage.

Find out if storm-resistant features have been added to the house, such as hurricane shutters, stronger windows and mesh coverings for vents in fire-prone areas. If not, ask for a cost estimate to add those features.

Ask if the community is taking steps to mitigate storm risk.

What Are Food Miles?

EcoWatch – Food

What Are Food Miles?

Linnea Harris – March 23, 2022 

A big rig semi truck transports boxes of pears. vitpho / iStock / Getty Images Plus

Food travels long distances – sometimes hundreds or thousands of miles – to reach our plates. Mapping the trajectory of many processed foods is to draw zig-zags across the globe, connecting faraway fields, factories, distribution centers, and store shelves. 

The concept of “food miles” was created in the 1990s to warn consumers of the connection between long-distance food transportation and mounting global carbon emissions. Recent estimates suggest that, in the U.S., processed food typically travels over 1,300 miles and fresh produce over 1,500 miles before it’s consumed. Ultimately, the further food travels, the more fossil fuels are needed, which in turn results in more greenhouse gas emissions that fuel climate change.

Just 10 companies – among them Nestlé, Mondelez, and Unilever – control almost all of the world’s large food and beverage brands. This concentration of food suppliers has left less room for small, local farmers, and means more and more of our food is transported across the country – or the globe – before being eaten. Take Iowa, for example: in 1870, 100% of all apples consumed in the state were also produced there. By the end of the 20th century, however, only 15% of apples consumed were produced by Iowa farmers.

The globalization of our food supply has also allowed consumers to become accustomed to foods grown only in other regions – think of coffee, which isn’t grown anywhere in the contiguous U.S. – or out-of-season foods that must be transported from warmer climates. Strawberries bought at a local farmers market during their summer growing season, for example, will have a lower food mileage than those shipped from California and purchased at a grocery store in December. 

Different methods have been employed over time to calculate food miles. The Weighted Average Source Distance (WASD) formula was developed by Annika Carlsson-Kanyama in 1997, and considers the weight of the transported food and the distance it travels from the place of production to the place of sale. To analyze foods with multiple ingredients – including many processed foods, like bread, packaged desserts, snacks, etc. – The Leopold Center for Sustainable Agriculture developed the Weighted Total Source Distance (WTSD) formula, which calculates the weight and distance traveled of each individual ingredient. 

How Are Food Miles Calculated? 

The WASD and WTSD are helpful formulas, but the Weighted Average Emissions Ratio (WAER) formula – developed in 2004 by the nonprofit LifeCycles – also takes into account the greenhouse gas emissions associated with the mode of transportation employed. So, it’s not just the literal miles traveled that matters, but the means by which it’s transported.
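To make the weighted-average idea concrete, here is a minimal sketch in Python of a WASD-style calculation, plus an emissions-weighted variant in the spirit of WAER. The ingredient weights, distances and emissions factors below are invented for illustration, and the published formulas differ in their exact details.

```python
# Minimal sketch of the weighted-average idea behind food-mile metrics.
# All ingredient data and emissions factors are hypothetical examples.

# Each tuple: (ingredient, weight in kg, distance to point of sale in km,
#              transport emissions factor in kg CO2 per tonne-km)
ingredients = [
    ("wheat flour", 0.50, 1200, 0.10),   # e.g. rail
    ("tomatoes",    0.30, 2500, 0.08),   # e.g. long-haul truck
    ("cheese",      0.20,  900, 0.12),   # e.g. refrigerated truck
]

# Weighted Average Source Distance: each distance weighted by ingredient weight.
total_weight = sum(w for _, w, _, _ in ingredients)
wasd_km = sum(w * d for _, w, d, _ in ingredients) / total_weight

# Emissions-weighted view: tonnes moved x distance x mode-specific factor,
# so the mode of transport matters, not just the mileage.
emissions_kg = sum((w / 1000) * d * ef for _, w, d, ef in ingredients)

print(f"Weighted average source distance: {wasd_km:.0f} km")
print(f"Estimated transport emissions:    {emissions_kg:.3f} kg CO2")
```

For this hypothetical basket the weighted average source distance works out to about 1,530 km, while the emissions-weighted figure shows why how the food travels can matter as much as how far it travels.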

The Impact of Faraway Food

Both transportation and agriculture are major culprits in human-caused climate change. In the U.S., transportation accounts for the largest share of national greenhouse gas emissions, and, according to the IPCC, agriculture accounts for one-fifth of all global CO2 emissions. The U.S. food system alone consumes more energy than all of France annually. 

Within the food system itself, transportation comprises 14% of all energy used, but greenhouse gas emissions are also related to where the food was produced: The Leopold Center found that conventionally-sourced food uses 4 to 17 times more fuel than local food, and produces 5 to 17 times more CO2. For processed foods, the impact is even larger. Think of a frozen lasagna: the wheat for the pasta might be grown in Kansas, the tomatoes and spinach for the sauce in California, the beef raised in Texas, and the cheese made in Wisconsin. Some of these materials might even need to be transported from the farm to another location to be processed – like the wheat to be made into sheets of lasagna noodles – then to the factory to be assembled, packaged, and finally shipped to grocery stores. 

Food miles also take into account the mode of transportation used – by water, road, rail, or air, in order of efficiency – which are not all created equal; transporting food by plane creates 50% more greenhouse gas emissions than food transported by sea. A 2005 study found that while air transportation only accounts for 1% of food transportation in the UK, it is responsible for 11% of the country’s emissions. 

Food mileage should also include how the food is procured by the customer. In our car-based society, where car-ownership rates by household have remained above 90% for a decade, many shoppers drive to a store to purchase their groceries. In 2015, researchers found that the median distance to the nearest food store for Americans was 0.9 miles, and that 40% of the population lived further than 1 mile from a food store, necessitating a car for many people in order to do their shopping. 

Debate Over Food Miles 

Climate and agricultural scientists don’t all agree on the benefits or accuracy of food miles when determining the environmental impact of food products. 

Many argue that this metric doesn’t take into account the whole carbon footprint of an item, or its non-emissions-related environmental impacts during production, like pesticide use, water pollution, or farmers’ rights. “Working out carbon footprints is horribly complicated,” said African agriculture expert professor Gareth Edwards-Jones of Bangor University in an interview with The Guardian. “It is not just where something is grown and how far it has to travel, but also how it is grown, how it is stored, how it is prepared.”

Local food is often espoused as the greener option, but this isn’t always true. For example, the energy needed to heat a greenhouse in the Northeast to grow tomatoes in the winter might actually be a more carbon-intensive process than shipping the tomatoes from California. A Swedish study found that tomatoes imported to Sweden from Spain were actually less energy-intensive than those grown locally in greenhouses. 

Some companies and organizations have instead begun using the Life Cycle Assessment (LCA) method to analyze the impact of their product. This method takes into account all stages in the life cycle of the product, from production, to processing, to packaging, to transportation, to disposal. The analysis goes beyond carbon emissions and considers other environmental factors like air and water pollution, use of natural resources, and impacts on human health. 

How to Reduce Food Miles

While the benefit of food miles might be contested, lowering your environmental impact with your food choices is always beneficial. 

To find the food miles of your favorite products, use this food miles calculator, or research where the product comes from. It might be unrealistic to expunge all faraway foods from your diet – given expense and convenience – but some items might be replaceable with local alternatives. Consider joining a CSA to get fresh produce from nearby farms at regular intervals, or shopping from local producers at a farmers market. Better yet, grow your own food! The only food miles to calculate will be the distance from your backyard or front stoop to your kitchen.

Eating seasonal produce will also ensure that your produce wasn’t shipped across the country to reach your plate. While you can’t always know if something was transported by plane, many perishables that need to be eaten quickly after harvesting are – like berries – so refraining from eating these products until they’re in season will cut down on air transport.

Beyond food miles, minimize your impact by cutting down on food-related emissions in other ways. Limiting or cutting out meat and dairy is among the most impactful of changes, as 57% of emissions from food production are attributed to animal-based food (including the production of livestock feed). Going fully vegan or vegetarian is great, but not imperative; just reducing animal products in your diet makes a difference. Lastly, instead of tossing food scraps in the trash, compost them at home to keep organic waste out of landfills. 

Half of US adults exposed to harmful lead levels as kids

Associated Press

Half of US adults exposed to harmful lead levels as kids

By Drew Costley – March 7, 2022

Over 170 million U.S.-born people who were adults in 2015 were exposed to harmful levels of lead as children, a new study estimates.

Researchers used blood-lead level, census and leaded gasoline consumption data to examine how widespread early childhood lead exposure was in the country between 1940 and 2015.

In a paper published in the Proceedings of the National Academy of Sciences on Monday, they estimated that half the U.S. adult population in 2015 had been exposed to lead levels surpassing five micrograms per deciliter — the Centers for Disease Control and Prevention threshold for harmful lead exposure at the time.

The scientists from Florida State University and Duke University also found that 90% of children born in the U.S. between 1950 and 1981 had blood-lead levels higher than the CDC threshold. And the researchers found a significant impact on cognitive development: on average, early childhood exposure to lead resulted in a 2.6-point drop in IQ.

The researchers only examined lead exposure caused by leaded gasoline, the dominant form of exposure from the 1940s to the late 1980s, according to data from the U.S. Geological Survey. Leaded gasoline for on-road vehicles was phased out starting in the 1970s, then finally banned in 1996.

Study lead author Michael McFarland, an associate professor of sociology at Florida State University, said the findings were “infuriating” because it was long known that lead exposure was harmful, based on anecdotal evidence of lead’s health impacts throughout history.

Though the U.S. has implemented tougher regulations to protect Americans from lead poisoning in recent decades, the public health impacts of exposure could last for several decades, experts told the Associated Press.

“Childhood lead exposure is not just here and now. It’s going to impact your lifelong health,” said Abheet Solomon, a senior program manager at the United Nations Children’s Fund.

Early childhood lead exposure is known to have many impacts on cognitive development, but it also increases risk for developing hypertension and heart disease, experts said.

“I think the connection to IQ is larger than we thought and it’s startlingly large,” said Ted Schwaba, a researcher at University of Texas-Austin who studies personality psychology and was not part of the new study.

Schwaba said the study’s use of an average to represent the cognitive impacts of lead exposure could result in an overestimation of impacts on some people and underestimation in others.

Previous research on the relationship between lead exposure and IQ found a similar impact, though over a shorter study period.

Bruce Lanphear, a health sciences professor at Simon Fraser University in Vancouver who has researched lead exposure and IQ, said his 2005 study found the initial exposure to lead was the most harmful when it comes to loss of cognitive ability as measured by IQ.

“The more tragic part is that we keep making the same … mistakes again,” Lanphear said. “First it was lead, then it was air pollution. … Now it’s PFAS chemicals and phthalates (chemicals used to make plastics more durable). And it keeps going on and on.

“And we can’t stop long enough to ask ourselves should we be regulating chemicals differently,” he said.

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Department of Science Education. The AP is solely responsible for all content.

Ocean’s Largest Dead Zones Mapped by MIT Scientists

EcoWatch – Oceans

Ocean’s Largest Dead Zones Mapped by MIT Scientists

 Olivia Rosane – January 26, 2022

MIT scientists have generated an atlas of the world’s ocean dead zones.

Oxygen-deficient zones intensity across the eastern Pacific Ocean, where copper colors represent the locations of consistently lowest oxygen concentrations and deep teal indicates regions without sufficiently low dissolved oxygen. Jarek Kwiecinski and Andrew Babbin

When you think of the tropical Pacific, you might picture a rainbow of fish ribboning their way between pinnacles of coral, or large sea turtles swimming beneath diamonds of sunlight. But there are two mysterious zones in the Pacific Ocean where life like this cannot survive. 

That is because they are the two largest oxygen-deficient zones (ODZ) in the world, which means they are no-go zones for most aerobic (oxygen-dependent) organisms. Two Massachusetts Institute of Technology (MIT) scientists recently succeeded in making the most detailed atlas to date of these important oceanic regions, revealing crucial new facts about them in the process. The new high-resolution atlas was described last month in the journal Global Biogeochemical Cycles.

“We learned just how big these two zones in the Pacific are, reducing the uncertainty in the measurement, their horizontal extent, how much and where these zones are ventilated by oxygenated waters, and so much more,” Andrew Babbin told EcoWatch in an email. Babbin is one of the atlas’s two developers and Cecil and Ida Green Career Development Professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “Being able to visualize in high resolution the low oxygen zones really is a necessary first step to fully understanding the processes and phenomena that lead to their emergence,” he said.

Natural Dead Zones

Oxygen-deficient zones can also be referred to as hypoxic zones or dead zones, as the National Oceanic and Atmospheric Administration explains. They can be caused by human activity, especially nutrient pollution. For example, the world’s second-largest dead zone is in the Gulf of Mexico, and is largely caused by the runoff of nitrogen and phosphorus from cities and factory farms.

The new atlas focuses on two naturally-occurring ODZs in the tropical Pacific, however. One is located off the coast of South America and measures about 600,000 cubic kilometers (approximately 143,948 cubic miles), or the equivalent of 240 billion Olympic swimming pools, MIT News reported. The second is around three times larger and located in the northern hemisphere, off the coast of Central America. 

Both natural and anthropogenic ODZs have something in common: too many nutrients. In the case of the Pacific ODZs, Babbin said, those nutrients build up because of wind patterns that push water offshore. 

“Deeper water then upwells to fill in this void, bringing higher nutrients to the surface,” Babbin told EcoWatch. “Those nutrients stimulate a massive amount of growth of phytoplankton, akin to how we fertilize crop lands and even our potted plants at home. When those phytoplankton then sink, heterotrophic bacteria act to decompose the organic material, consuming oxygen just like humans do to respire our food.” 

However, because of where these zones are located, it takes a long time for oxygen-rich waters to reach the area and replenish what the bacteria gobble up.

“In essence, the biological demand of oxygen outpaces the physical resupply,” Babbin concluded. 

While these specific zones aren’t caused by human pollution, understanding them is still important in the context of human activity. ODZs can emit the greenhouse gas nitrous oxide, and there is a concern that the climate crisis may cause them to expand.