Op-Ed: L.A. ports can’t follow business as usual. Our shipping system is unsustainable

Los Angeles Times

Christina Dunbar-Hester – January 30, 2023

The Port of Los Angeles, shown last August, is the busiest seaport in the Western Hemisphere. (Allen J. Schaben / Los Angeles Times)

Ports in the Los Angeles region entered national headlines as a supply chain crisis unfurled during the pandemic. After an initial near-halt to commerce and shipping in early 2020, some of us saw bluer skies and enjoyed cleaner air for a fleeting moment.

But by 2021, consumer purchasing skyrocketed and trade came roaring back. Though that might sound good for business, it’s a status quo in which the L.A.-Long Beach port complex is Southern California’s largest single source of pollution. If California wants to live up to its reputation as an environmental leader, port operations require more scrutiny — and change.

Though the ports were built to transport general goods and commodities, their fate has been particularly tied to fossil fuels. The rise of oil from the 1920s onward spurred their development to handle a large volume of petroleum. The wealth this generated was poured back into the ports themselves, intensifying the scale of trade. Combined, Los Angeles-Long Beach makes up the largest container port complex in the Western Hemisphere, through which goods — especially from Asia — reach warehouses, retail shelves, e-commerce fulfillment centers and ultimately consumers’ homes.

The pandemic dramatically illustrated the scope of this economic engine. A spike in consumer demand coincided with labor interruptions and other snarls to supply chains, exemplified by the logjam off the coast of Southern California where dozens of ships queued waiting to dock. Residents, especially those living near the ports and distribution corridors, breathed in sharply elevated air pollution.

To preempt future disruptions, state and local officials and the Biden administration have moved to streamline and expand goods-handling in the last couple years. Biden announced that hours of port operation would be extended to keep cargo movement humming. The Port of Long Beach unveiled a new bridge built to allow larger ships’ passage (even as seas rise), and it received federal authorization to deepen its shipping channels. Local officials now fret about whether ports on the East Coast and the Gulf of Mexico will snatch away a significant share of cargo business because of disruptions in Southern California.

Economic concerns are understandable, especially since the ports are associated with thousands of jobs. But building bigger operations to move an ever-increasing volume of goods is short-sighted locally and globally. Massive ships create infrastructure demands at odds with our need to reduce carbon emissions, curb resource extraction and control environmental pollutants. Many shipped consumer goods are bound for landfills after only a very short period of use. Apparel, appliances, electronics and furniture have shorter lifespans than they did a few decades ago. The way we consume goods right now is simply not sustainable.

Meanwhile, officials and regulators have been sharply criticized for delaying measures to safeguard health for communities around the ports. As air quality activists note, cutting port emissions is urgently needed. Electrifying port and warehouse equipment is underway, but long-haul journeys, including ocean shipping itself and truck distribution, also need to transition off fossil fuel — cargo ship fuel is even dirtier than the diesel on which trucks run — and meet much lower emissions targets. San Pedro Bay’s port complex also traffics a large volume of fossil fuels in addition to consumer goods. Petroleum handling in the ports will need to be significantly diminished to meet the challenge of climate change.

The ports play a substantial role in the interlocking crises in our region, which require an expansive vision. After decades of improvement, air pollution is rising again, due not only to transportation and industrial emissions but also to bigger wildfires, which are the result of rising temperatures. Global shipping at scale also contributes to the erosion of Indigenous sovereignty by encouraging extractive practices that degrade land, which in turn drives global warming and a related biodiversity and extinction crisis.

How California tackles these threats will have effects far beyond our state. Gov. Gavin Newsom’s “30×30” plan — which made California the first state to commit to conserving 30% of its land by 2030 — will provide wildlife habitat that can help absorb carbon. Yet conservation cannot absolve California of its lethal industrial areas. We must approach even freight corridors as spaces for people and nature rather than “sacrifice zones” where toxic exposure is accepted as necessary for industrial activity.

As Angelenos, we should be planning for a future where the success of the ports and the region is not measured by year-over-year growth in goods movement. Indeed, a more livable future in this region might see the ports planning for fewer ships and fewer goods, handled more slowly and accompanied by good jobs in cleaner energy, environmental stewardship and remediation of contaminated sites.

A just energy transition will require that we examine every part of business as usual. That means reconsidering how we’ve managed the ports for the past century. We should be reimagining their role in a more democratic, far less fossil-fuel-dependent future.

Christina Dunbar-Hester is a communication professor at the Annenberg School for Communication and Journalism at USC, a current member of the Institute for Advanced Study and the author of “Oil Beach.”

Family of Newlywed and Activist Decapitated at Utah’s Arches National Park Awarded More Than $10M

People

Melissa Montoya – January 30, 2023

A federal judge awarded more than $10 million to the family of a Ugandan human rights activist who was decapitated while on a visit to Arches National Park in 2020.

Esther “Essie” Nakajjigo’s husband Ludovic Michaud will receive $9.5 million while her mother Christine Namagembe will receive $700,000, according to the judgment filed in federal court. Essie’s father John Bocso Kateregga will receive $350,000.

Nakajjigo’s husband and parents filed a $270 million administrative claim against the National Park Service in 2021 over her death.

Nakajjigo and Michaud spent June 13, 2020, at Arches National Park in Utah celebrating the first anniversary of the day they met, according to the Associated Press.

The newlyweds were on their drive out of the park, with Nakajjigo in the passenger seat, when a strong wind pushed the entrance gate into the road and it sliced through their rental car “like a hot knife through butter,” the claim said, according to the AP.

The activist was decapitated.

Zoe Littlepage, a lead attorney on the case, told The Salt Lake Tribune that “on behalf of the family, we are very appreciative of the judge’s attention to detail, the time he spent working on this, and for the value he put on the loss to this family of Essie.”

Esther Nakajjigo (Esther Nakajjigo/Twitter)

In a statement to the newspaper, U.S. Attorney for the District of Utah Trina Higgins said Nakajjigo’s family was entitled to damages.

The trial began Dec. 5 in Utah and was meant to determine how much money was owed to the family, according to the Salt Lake Tribune.

During the trial, a U.S. attorney representing the government said, “The United States was 100 percent at fault. … And we want to express on behalf of the United States our profound sorrow for your loss,” per the newspaper.

“We respect the judge’s decision and hope this award will help her loved ones as they continue to heal from this tragedy,” the statement read. “On behalf of the United States, we again extend our condolences to Ms. Nakajjigo’s friends, family and beloved community.”

“Essie was a remarkable humanitarian and champion for women and girls. This verdict, though the largest by a federal judge in Utah history, cannot replace the immeasurable loss suffered by her husband and family. We are grateful that Judge Jenkins honored Essie’s life and legacy with this award,” Littlepage said in a statement to PEOPLE.

Higgins did not immediately return PEOPLE’s request for comment.

Nakajjigo was Uganda’s ambassador for women and girls, and ran a health center in her home country that she set up when she was just 17 years old to provide free health services to adolescents.

She was also the brains behind two reality TV shows that aimed to empower young mothers and encourage girls to stay in school.

She reportedly moved to Colorado for a social entrepreneurship program at the Watson Institute in Boulder.

Absence from work at record high as Americans feel strain from Covid

The Guardian

Melody Schreiber – January 29, 2023

Photograph: Jae C Hong/AP

For many Americans it feels like everyone is out sick right now. But there is a good reason: work absences from illness are at an all-time annual high in the US and show few signs of relenting. And it’s not just acute illness and caregiving duties keeping workers away.

About 1.5 million Americans missed work because of sickness in December. Each month, more than a million people have called out sick for the past three years. About 7% of Americans currently have long Covid, which can affect productivity and ability to work, according to the Centers for Disease Control and Prevention (CDC).

The last time the absentee number dipped below a million Americans was in November 2019.

Last year, the trend accelerated rather than returning to normal. In 2022, workers had the most sickness-related absences of the pandemic, and the highest number since record-keeping began in 1976.

In 2022, the average was 1.58 million per month, for a total of 19 million absences for the year. The largest spike was in January 2022, when 3.6 million people were absent due to illness, about triple the pre-pandemic number for that month.

Parents and caregivers also saw the highest rates of childcare-related absences of the entire pandemic in October 2022 as illnesses surged amid relaxed precautions and lower vaccination rates among children.

Patterns in absenteeism correspond with rises and falls in the spread of Covid. But long Covid is probably contributing to sick leave rates as well.

One analysis in New York found that 71% of long Covid patients who filed for workers’ compensation still had symptoms requiring medical attention or were completely unable to work for at least six months. Two in five returned to work within two months but still needed medical treatment. Nearly one in five (18%) of claimants with long Covid could not return to work for a year or longer after first getting sick. The majority were under the age of 60.

Workforce participation has dropped by about 500,000 people because of Covid, according to one study that looked over time at workers who were out sick for a week. But the actual number could be higher, because not all workers are able to take time off when they are ill, said Katie Bach, a nonresident senior fellow at the Brookings Institution.

“It’s likely that long Covid is keeping somewhere around 500,000 to a million full-time-equivalent workers out of work,” Bach said.

Some affected by long Covid have reduced their hours, while others have left the workforce temporarily or permanently – a metric not captured by work absence data, but calculated in labor participation statistics.

Patients who are very sick with long Covid often “try to work for some amount of time and then eventually they drop out”, Bach said.

Between death and disability, the workforce has been reduced by as much as 2.6% during the pandemic, with 1bn days of work lost, McKinsey recently reported.

Those who stay in their jobs may need more sick leave than before because of new chronic illnesses.

“People who are on the less-sick end of long Covid, maybe they can keep working, but every now and then they might need a day or two off just because they have overdone it or something happened that triggered a symptom flare,” Bach said.

Nearly one in five Americans developed long Covid after their initial infection, with some 7.5% of all American adults currently experiencing long Covid, according to the CDC. The CDC began collecting data on how many people have long Covid in 2022.

Much more research still needs to be done on the causes of and treatments for long Covid, the researchers said. Some patients do eventually recover, for instance, but it’s not clear why or how long they will be sick.

“We don’t know how long it’s taking them to recover. There’s a lot of uncertainty there,” said Alice Burns, associate director of the program on Medicaid and the uninsured at the Kaiser Family Foundation.

The more immunity people have, from vaccines and recovery from prior cases, the less likely they are to get sick in the first place, which reduces the risk of developing long Covid. But it is still possible to have long Covid even after mild or asymptomatic infection.

All of this means the US may continue to see higher-than-normal workplace absences.

“Some people just really need flexibility from their employers,” Burns said. That can include telework, unscheduled leave, flexible schedules and reduced hours.

“The challenge with that is, those supports are a lot more likely to be available to workers who have office jobs, higher-paying jobs, who are pretty well-established in the labor market,” Burns said.

“Covid in general, and long Covid too, are more likely to affect people who are minorities, who have lower levels of education, [who have] likely lower levels of income. So there may be, for many people, a mismatch between the people who need some of these employment-related supports and the types of jobs they are in.”

Employers can adjust to this new normal by offering as many accommodations as possible, both for those suffering initial bouts of Covid infection and those experiencing longer-term symptoms, Bach said. Again, some of the jobs where people are most at risk might be the least accommodating – it’s usually easier for office workers to telecommute than it is for fast-food workers – but there are still steps employers can take.

“Companies have to get creative, like: can we offer more frequent breaks?” Bach said. “Can we as a society convince Medicare and Medicaid to reimburse a little bit more where companies are employing people with long Covid? What memory aids can we put together?”

If long Covid continues to affect 7% of the country, that’s 23 million people at any given time who may require accommodations under laws like the Americans with Disabilities Act.

“But there isn’t a lot of clarity about what is a reasonable accommodation” under the law when it comes to Covid and long Covid, Burns said.

While Covid has thrown the country into disarray in every realm, including work, it is also shining a more intense light on the ways chronic illness affects productivity and workforce participation – a change that disability and chronic illness activists say is long overdue, Bach pointed out.

“My hope is that it’s big enough that we can rethink how we research and treat these diseases, and how we approach workplace accommodation,” Bach said. “In a world where any of your workers could suddenly become disabled, I think you have to be more flexible.”

‘I use it because it’s better’: why chefs are embracing the electric stove

The Guardian

Whitney Bauck – January 29, 2023

The evidence that gas stoves are bad for human health has grown so staggering over the last few years that the US Consumer Product Safety Commission recently announced that it would consider banning the appliances. Though a conservative backlash prompted the White House to rule out the possibility of a nationwide ban, and some states have passed pre-emptive laws that prohibit cities from ever passing gas bans, other cities including Berkeley, New York and San Francisco have already moved to bar new gas hookups due to health and environmental concerns.

One study from earlier this month found that one in eight cases of childhood asthma in the US is caused by gas stove pollution. According to the lead author on the study, Talor Gruenwald, a research associate at the non-profit Rewiring America, that means that living in a home with a gas stove is comparable to living in a home with a smoker. Gas stoves release pollutants so harmful that the air pollution they create would be illegal if it were outdoors, and that’s not just true when you’re actively cooking – gas stoves continue to emit harmful compounds like methane even when turned off. Beyond the adverse health impacts, those emissions are greenhouse gasses that also contribute to the climate crisis.

But solutions are within reach. “The most surefire way to eliminate risk of childhood asthma from gas stoves is to move to a clean cooking alternative like an induction stovetop or electric stovetop,” said Gruenwald.

Switching over to electric isn’t just a boon to your health and the planet – it also makes for a better cooking experience, according to a growing number of professional chefs. Read on to hear from three who have embraced electric and are loving the results.

Jon Kung: wok cooking that’s ‘more of an authentic experience’

Though he may be best known these days for TikTok videos showing off his kitchen prowess, deadpan humor and the occasional thirst trap, Jon Kung had been working as a chef professionally for more than a decade before pandemic lockdowns prompted him to start posting cooking videos on the internet. He was first introduced to induction cooking, which uses a magnetic field to efficiently heat pots and pans, while working in a commercial kitchen in Macau, China. He began relying heavily on induction burners in his current home of Detroit, Michigan, because he was often working pop-ups in spaces with limited ventilation.

“There was no altruistic intent in my decision to adopt induction. I use it because it’s better,” he said. “Induction stovetops are easier to clean, they’re more responsive, and they are just as powerful, if not more powerful, than gas. My induction burner can boil eight quarts of water within 11 minutes – it’s super fast.”

These days, Kung uses induction “100% of the time”. He often works on an induction wok, which features an induction cooktop with a bowl-shaped surface that a wok perfectly fits into, and rejects the critique that gas stove bans would prohibit chefs from cooking Chinese food authentically.

“You can buy a curved induction wok burner specifically made for woks and it works better than cooking on a wok on a western gas range,” he said. “That wok burner was literally made by Chinese people to cook Chinese food – when I cook in that it’s more of an authentic experience than cooking on a KitchenAid or a Viking range could ever be.”

Still, Kung admitted that there will be a learning curve for chefs when they initially make the switch. The biggest difference, he noted, is that gas stoves offer both “visual and tactile” feedback about how hot the cooking surface is, while induction cooktops require users to rely on numbers on a screen to know what temperature they’re working with. He recommended cooking with eggs when you’re first switching over to quickly get the kind of visual feedback that will help you learn to use an induction burner.

And for the small handful of dishes that truly require fire – think crème brûlée or charring peppers – he keeps a blowtorch in his kitchen. “I think flame should be a seldomly used tool for specific purposes in my kitchen, instead of putting my health at risk all the time because of these few times I need to actually use fire,” he said.

Christopher Galarza: quicker, easier to clean and a low barrier to entry

Christopher Galarza spent a decade working in conventional kitchens before he had his first experience in an all-electric commercial kitchen as an executive chef at Chatham University, a Pittsburgh institution known for its focus on sustainable food systems. Going electric changed his and his staff’s experience of working in the kitchen, partly because working with gas stoves can be a sweltering experience.

“I had a meat thermometer in my chef coat at one old restaurant job, and I looked down one day and noticed that my thermometer read 135F,” he said. In contrast, the all-electric kitchen he worked in at Chatham stayed pleasantly in the low 70s even on summer days when it was 90 degrees outside and the kitchen was in full production mode. “We were able to drastically reduce the temperature in the kitchen, which made us all more comfortable,” he added. “And for me personally, I can tell you that my mental health was better.”

He’s convinced that’s a benefit that got passed along to the guests eating the food he was cooking. “People can feel when you’re stressed,” he said, “and they can tell when you’re relaxed and happy.” But there was also a benefit to the bottom line, in that induction stoves are much quicker and easier to clean, which allowed him to spend less money on harsh cleaning chemicals and to send his kitchen staff home earlier while the “dollar per labor hour went way up”.

He cites other studies showing that the utility costs of operating a gas-powered or electric-powered kitchen are pretty similar, and notes that even for home chefs, the barrier to entry is low: “You can go on Amazon and buy an induction burner for $60 that plugs into the same outlet that you have your coffeemaker in,” he said.

Galarza is so convinced that electric is the future of professional cooking that he’s started a consultancy to help other kitchens make the switch. “Every international culinary competition in the world, from the Bocuse d’Or to the Culinary Olympics, is all electric,” he said. “The metric by which the international cooking community judges each other is on induction. And those are the best chefs on the planet.”

Even though rightwing politicos have been inciting a culture war around gas stoves in the US, he dismisses much of it as political posturing. “Ultimately, no one’s going to come into your home with a crowbar and take your stove, just like no one’s kicking down your door and checking your house for asbestos or lead paint,” he said. “The gas stove is this generation’s equivalent of lead paint. It’s something we thought was OK, that we later found out is a hazard. And now we have an opportunity to make it right.”

Tu David Phu: no better way to sear meat

Before Chef Tu David Phu worked in the kitchens of top-tier restaurants like New York’s Daniel or San Francisco’s Acquerello or appeared on shows like Top Chef or Chefsgiving, he was a “first-generation Vietnamese American kid from Oakland who grew up food insecure”, he said. His experiences with food at both ends of the economic spectrum – from childhood in a food desert to an adulthood that has included cooking for the world’s wealthiest people – have deeply shaped how he sees sustainability conversations in the context of food and cooking.

He became familiar with induction cooking in fine dining kitchens, which he said prioritized electric stovetops because they allow for chefs to work in small spaces and with greater precision – the pastry department at one of his old jobs was particularly fond of induction’s capacity for melting chocolate or making syrups without burning them. But Phu is adamant about breaking down the idea that kitchen electrification only concerns the privileged.

“I feel very passionately about including working class and poor people in this electrification movement,” he said. Black, brown and Indigenous communities are already disproportionately at risk for pollution-related health impacts, due to “modern-day redlining” that locates polluting industries in BIPOC neighborhoods, he said; they shouldn’t also be saddled with the health impacts of not having any other option than to cook on gas. “Decarbonization as a whole, not just electrification, is a justice issue,” he said. He commends the Inflation Reduction Act provisions that allow for low-income households to get as much as $840 in rebates toward electric stoves, but wants to see more initiatives focused on spreading the word about these options to the communities that need them most.

On a personal level, the Orange County, California-based chef uses induction cooktops “religiously” in his own home, and argues that there’s no better way to sear meat than in a cast-iron pan on an induction cooktop. His biggest tip for successful induction use is to remember that induction cooktops can reach the smoking point in about 15 seconds, so he recommends staying in the low to medium power range when cooking, unless you’re boiling water.

He recognizes the importance of personal and cultural identities that get tied up in food, but he doesn’t think they should be a barrier to making changes that are necessary for the health of people and the planet. “My response to the resistance from some in the Asian community saying they can’t cook ‘authentic’ food without gas is: it doesn’t matter if you can cook a certain way or not if you don’t have an ozone or fresh air to breathe,” he said. “Throughout the course of all of our histories, we’ve prioritized our survival first, and we adjusted and modified our identities and cultures around that, because survival is more important.”

There’s almost unlimited clean, geothermal energy under our feet. New tech could help unleash that potential in New Mexico.

Albuquerque Journal, N.M

Kevin Robinson-Avila, Albuquerque Journal, N.M. – January 28, 2023

Jan. 28—Canadian company Eavor Inc. drilled an 18,000-foot well bore this past fall in southwest New Mexico to prove it could hammer its way through deep-underground, hard-granite rock to reach previously untapped geothermal energy.

Eavor’s well now stands as the deepest hole ever drilled in New Mexico, successfully demonstrating that the company’s new technology can potentially crack open access to vast subsurface hot-rock formations that offer massive amounts of clean, renewable energy.

Eavor’s success is just the latest achievement in what could soon become a global renaissance in geothermal development that’s got both industry experts and public officials hyped about the potential for unleashing a virtually unlimited source of clean energy for electric generation, and for heating and cooling of homes and buildings.

“We have massive geothermal resources sitting below our feet, but it’s been elusive to tap into the deep subsurface areas we need to reach to extract that energy economically and use it,” Eavor Vice President of Business Development Neil Ethier told the Journal. “… Our drilling project in southwest New Mexico showed that our technology can unlock that geothermal potential, and it’s now ready for commercial development.”

In fact, the company is preparing to break ground in Nevada on its first 20-megawatt geothermal power plant in the U.S. using its new technology to exploit deep hot-rock formations. The project will supply power to local utility NV Energy, pending approval by state regulators in Nevada.

That project could be the first of many new power plants Eavor expects to build in western states, where geothermal energy is more readily accessible at levels closer to the surface than in other places. Eventually, that could include New Mexico as well, which has the sixth-highest geothermal potential in the nation, according to the National Renewable Energy Laboratory in Colorado.

“New Mexico’s geothermal resource is very good,” Ethier said. “It’s a wonderful opportunity for New Mexico to develop clean, firm, baseload electricity that employs New Mexicans.”

Eavor is one of many companies now aggressively pursuing geothermal development with modern drilling technologies that allow them to tap into the deep underground rock formations that eluded the industry in years past.

Texas-based Fervo Technologies, for example, has also signed new power purchase agreements in western states to build modern geothermal power plants, including three separate projects with utilities in California for a combined total of nearly 100 MW of generation. And, as that company perfects its drilling techniques — and as economies of scale kick in to lower costs — Fervo expects to target a lot more places for geothermal development, including New Mexico, said Fervo Senior Associate for Policy and Regulatory Affairs Laura Singer.

“We definitely see New Mexico as an opportunity for the future once we get our drilling costs lower and our techniques fully hammered out,” Singer told the Journal.

State legislation

Both Eavor and Fervo met with a geothermal working group last year that state Sen. Gerald Ortiz y Pino, D-Albuquerque, formed to explore local development potential, paving the way for newly proposed legislation in this year’s session to promote the industry.

Ortiz y Pino has filed the Geothermal Resources Development Act, Senate Bill 8, to provide $25 million in state money for grants and loans for research and development of geothermal energy projects around New Mexico. And he filed a second bill, SB-173, to offer up to $10 million annually in tax breaks for new geothermal projects.

The legislation could inspire more investment in both geothermal electric generation, and use of geothermal energy to heat and cool homes and buildings.

Heating-and-cooling technology is well developed. But it requires more education and promotional incentives to encourage broad market adoption and deployment.

In contrast, geothermal electric generation based on today’s emerging technologies that target deep hot-rock formations is still evolving. But it’s nearing the commercial break-out point.

“We’re on the cusp of it,” Ortiz y Pino told the Journal. “Eavor just drilled a hole nearly 19,000 feet deep to show it can do this. That opens the door to a lot more potential development as other energy companies jump in.”

Both of Ortiz y Pino’s bills have bipartisan support, with two Republican senators co-sponsoring them. And more bipartisan backing is likely, Ortiz y Pino said.

That’s because, apart from offering clean “baseload” energy that can operate 24/7 all year long, today’s emerging technology could also create direct employment opportunities for workers in the oil and gas industry as the state diversifies away from fossil fuels.

Drilling for heat, not hydrocarbons

Indeed, it’s the modern drilling technologies developed by the oil and gas industry that are opening the gateway to deep underground geothermal energy, making the drilling rigs and skilled workforce that manage today’s oil and gas operations essential for companies like Eavor and Fervo to bust through hard, subsurface granite to reach hot-rock formations.

“We’re piggybacking off technology advancements in oil and gas drilling,” Ethier said. “But instead of drilling for hydrocarbons, we’re drilling for heat. Fifteen years ago we couldn’t do this.”

Modern hydraulic fracturing methods that include hardened drill bits to crack open tough shale beds — plus advanced seismic sensor technology and data analysis to pinpoint and accurately target underground hydrocarbon deposits — all contributed to the shale gas revolution, allowing the industry to exploit previously untapped oil-and-gas reservoirs.

More recently, horizontal drilling technology has pushed oil and gas operations into unprecedented levels of development, permitting operators to penetrate laterally into shale beds stretching in all directions to reach more pockets of hydrocarbons.

Now, those same drilling techniques — combined with further technology development by the geothermal companies themselves — are creating a paradigm shift that, for the first time, lets developers dig far below the shallow hot water aquifers that the geothermal industry has traditionally targeted and instead bore deeper into hot-rock formations.

That capability opens up access to far more geothermal energy in many more places, because developers are no longer limited to exploring and developing around volcanos and fault lines where natural subsurface fracturing has created pools of relatively shallow, underground reservoirs. Such conditions are relatively rare and are concentrated in certain places, such as the western U.S.

“The industry has been historically limited to conventional wet, steamy reservoirs where developers look for the steam and natural fault lines,” Singer said. “We don’t need steam now. We look instead for hot rock at reasonable depths. Subsurface heat exists everywhere — it’s just a matter of how deep it is.”

Nearly 20 years ago, extensive research showed that intense subsurface heat is ubiquitous and basically inexhaustible nearly everywhere beneath the Earth’s surface, with heat level depending on depth, said Shari Kelly, a senior geophysicist and field geologist with the state Bureau of Geology and Mineral Resources.

“We came to realize that no matter where you are in the U.S. — even if it’s Connecticut — if you drill deep enough you can reach temperatures that are usable for heat and electricity,” Kelly told the Journal. “… That really shifted the perspective on geothermal development.”

The challenge, however, has been the lack of adequate drilling technology that could slice through hard rock to reach the necessary depths while also withstanding extreme subsurface temperatures that can shut down drilling equipment.

“Today’s drilling technology allows developers to reach those deep depths,” Kelly said. “It’s a game changer.”

Advancing the technology

Companies like Fervo and Eavor are now building on oil and gas drilling technology to develop techniques and methods specifically geared toward deep geothermal development.

Fervo, for example, has developed advanced data analytics using down-hole fiber optics to gather and analyze real-time data on flow, temperature and performance of geothermal resources, Singer said. That provides much greater insight into subsurface behavior, allowing the company to precisely identify where the best resources exist and optimize well performance.

Once the hole is drilled and fracked, the company pumps cold water down into the well bore, where it’s heated to between 350 and 400 degrees Fahrenheit and then brought back to the surface to create steam to run a turbine generator.

Conventional wells that tap into existing hot water aquifers usually don’t penetrate more than 3,000 feet down, and those wells generally produce heat of only 200 to 300 degrees. In contrast, Fervo is targeting rock formations 8,000 to 10,000 feet down, providing much greater heat for more efficient and abundant generating capacity.

“Some companies are looking to drill extremely deep into extremely hot rock,” Singer said. “We’re not. We’re targeting more moderate depths that allow us to use existing oil and gas drill bits and equipment.”

Eavor, meanwhile, has created new technology to drill far deeper wells of up to 23,000 feet or more, Ethier said. That requires extreme temperature-resistant equipment with reinforced drill bits to break through hard granite rock.

To do that, it’s created proprietary insulated drill pipes and partnered with industry vendors to design new drill bits. It’s also developed advanced down-well control technology to precisely place liquid-filled pipes through two well bores that pump water down for heating at the geothermal resource and then bring it back up again.

And the entire process is contained in a novel, closed-loop system where the water being heated never leaves the underground or surface pipes. Rather, it absorbs heat from the hot-rock bed like a radiator, using horizontal drilling to place piping offshoots directly next to the geothermal resource, which then heats up the water inside the tubes before it’s brought back to the surface.

“We have over 30 patents covering a lot of technology components, including proprietary software, hardware and system design,” Ethier said.

Eavor directly tested most of its technology in the New Mexico Bootheel at a drill site located next to the Lightning Dock geothermal power plant near Lordsburg. That’s the only conventional geothermal facility currently operating in the state.

“We met all our technology milestones,” Ethier said.

Future employment opportunities

That test operation also demonstrated lucrative future employment potential for oil and gas industry workers. Two conventional drilling rigs were used on the project, which lasted from August to December last year.

“We had more than 50 people employed at the rig site throughout construction,” Ethier said. “And that doesn’t include local services we used for fuel and water delivery, or for sewage and garbage disposal. It was also a boon for local hotels and restaurants in the area.”

As industry development gains momentum and companies begin drilling deeper wells for power plants, and for heating and cooling applications, a lot more employment opportunities could emerge for skilled oil and gas drilling crews, engineers and seasoned industry professionals.

In fact, most companies now pursuing modern geothermal development are largely run by former oil and gas executives and staffed by industry workers. Helmerich & Payne Inc., for example — an oil and gas drilling rig operator — is an investor in Eavor.

Global drilling company Baker Hughes also formed a partnership with two industry giants, Continental Resources and Chesapeake Energy, to test whether they can profitably turn spent natural gas wells into geothermal facilities, according to Politico. And Chevron New Energies, a subsidiary of Chevron Corp., is partnering with Sweden’s Baseload Capital to develop new geothermal technologies, starting with a new project in the Weepah Hills mountains in Nevada.

“We’re not taking away from the oil and gas industry, but adding stability to it,” Ethier said. “This can provide a just transition for energy diversification that offers other options for employment.”

Forging ahead

Full-scale deployment of emerging geothermal technology — now called enhanced geothermal systems, or EGS — is still a few years off, but it’s a lot closer than many think, Singer said.

“We’re ready to deploy,” she said. “This is not technology that needs to be reinvented, because the technology and skills are there. It’s a matter of just starting to drill wells, and we’re ready to go.”

As momentum accelerates, it will allow drilling and development costs to decline through economies of scale and continuous improvement in technology and system efficiency, making EGS more economical compared with fossil fuels like natural gas, Singer added.

“One reason for the shale gas revolution success was continuous drilling and constantly evolving technology and techniques to bring down costs,” Singer said. “Geothermal has not yet experienced that, and it’s what’s needed.”

Challenges remain. More temperature-resistant drilling technology, for example, is critical as wells go deeper, and a lot more subsurface research is needed to identify the best places for geothermal development.

Permitting issues could also cause problems, slowing development down the same way transmission projects are routinely held up through local, state and federal regulatory requirements that delay planning and construction for years.

But federal- and state-level investment and incentives can help with all those challenges. The U.S. Department of Energy announced in September a new “Energy Earthshot” to lower the cost of EGS by 90%, to $45 per megawatt-hour, by 2035, which would make it significantly more affordable than today’s prices for natural gas.

That includes $44 million in new investments in EGS through the DOE’s Frontier Observatory for Research in Geothermal Energy laboratory in Utah, plus $84 million in funding under the federal Bipartisan Infrastructure Law to support four EGS demonstration projects in different locations.

State-level initiatives like Ortiz y Pino’s bills can also help. And apart from potential bipartisan legislative support, environmental organizations are getting on board, given geothermal’s potential to provide clean backup power for intermittent solar and wind facilities as the state transitions from fossil fuels to renewables.

Some environmental activists took leading roles in Ortiz y Pino’s working group, and environmental organizations are expected to firmly back the senator’s bills in this year’s session.

“It’s such a great opportunity for us to supplement wind and solar in a sustainable fashion,” Ortiz y Pino said. “Geothermal runs 24/7, 365 days a year. It doesn’t go away, and it makes freeing ourselves from fossil fuels much more realistic.”

New Study Finds the Best Brain Exercises to Boost Memory

Prevention

Korin Miller – January 28, 2023

  • Research has found exercise can have a positive impact on your memory and brain health.
  • A new study linked vigorous exercise to improved memory, planning, and organization.
  • Data suggests just 10 minutes a day can have a big impact.

Experts have known for years about the physical benefits of exercise, but research has been ongoing into how working out can impact your mind. Now, a new study reveals the best exercise for brain health—and it can help sharpen everything from your memory to your ability to get organized.

The study, which was published in the Journal of Epidemiology & Community Health, tracked data from nearly 4,500 people in the UK who had activity monitors strapped to their thighs for 24 hours a day over the course of a week. Researchers analyzed how their activity levels impacted their short-term memory, problem-solving skills, and ability to process things.

The study found that moderate and vigorous exercise and activities—even bouts of less than 10 minutes—were linked to much higher cognition scores than those of people who spent most of their time sitting, sleeping, or doing gentle activities. (Vigorous exercise generally includes things like running, swimming, biking up an incline, and dancing; moderate exercise includes brisk walking and anything that gets your heart beating faster.)

The researchers specifically found that people who did these workouts had better working memory (the small amount of information that can be held in your mind and used in the execution of cognitive tasks) and that the biggest impact was on executive processes like planning and organization.

On the flip side: people who spent more time sleeping, sitting, or doing only light movement in place of moderate to vigorous exercise had a 1% to 2% drop in cognition.

“Efforts should be made to preserve moderate and vigorous physical activity time, or reinforce it in place of other behaviors,” the researchers wrote in the conclusion.

But the study wasn’t perfect—it used previously collected cohort data, so the researchers didn’t know extensive details of the participants’ health or their long-term cognitive health. The findings “may simply be that those individuals who move more tend to have higher cognition on average,” says lead study author John Mitchell, a doctoral training student in the Institute of Sport, Exercise & Health at University College London. But, he adds, the findings could also “imply that even minimal changes to our daily lives can have downstream consequences for our cognition.”

So, why might there be a link between exercise and a good memory? Here’s what you need to know.

Why might exercise sharpen your memory and thinking?

This isn’t the first study to find a link between exercise and enhanced cognition. In fact, the Centers for Disease Control and Prevention (CDC) specifically states online that physical activity can help improve your cognitive health, improving memory, emotional balance, and problem-solving.

Working out regularly can also lower your risk of cognitive decline and dementia. One scientific analysis of 128,925 people published in the journal Preventive Medicine in 2020 found that cognitive decline is almost twice as likely in adults who are inactive vs. their more active counterparts.

But, the “why” behind it all is “not entirely clear,” says Ryan Glatt, C.P.T., senior brain health coach and director of the FitBrain Program at Pacific Neuroscience Institute in Santa Monica, CA. However, Glatt says, previous research suggests that “it is possible that different levels of activity may affect brain blood flow and cognition.” Meaning, exercising at a harder clip can stimulate blood flow to your brain and enhance your ability to think well in the process.

“It could relate to a variety of factors related to brain growth and skeletal muscle,” says Steven K. Malin, Ph.D., associate professor in the Department of Kinesiology and Health at Rutgers Robert Wood Johnson Medical School. “Often, studies show the more aerobically fit individuals are, the more dense brain tissue is, suggesting better connectivity of tissue and health.”

Exercise also activates skeletal muscles (the muscles that connect to your bones) that are thought to release hormones that communicate with your brain to influence the health and function of your neurons, i.e. cells that act as information messengers, Malin says. “This could, in turn, promote growth and regeneration of brain cells that assist with memory and cognition,” he says.

Currently, the CDC recommends that most adults get at least 150 minutes a week of moderate-intensity exercise.

The best exercises for your memory

Overall, the CDC suggests doing the following to squeeze more exercise into your life to enhance your brain health:

  • Dance
  • Do squats or march in place while watching TV
  • Start a walking routine
  • Use the stairs
  • Walk your dog, if you have one (one study found that dog owners walk, on average, 22 minutes more every day than people who don’t own dogs)

However, the latest study suggests that more vigorous activities are really what’s best for your brain. The study didn’t pinpoint which exercises, in particular, are best—“when wearing an accelerometer, we do not know what sorts of activities individuals are doing,” Glatt points out. However, getting your heart rate up is key.

That can include doing exercises like running, swimming, biking up an incline, or dancing.

Malin’s advice: “Take breaks in sitting throughout the day by doing activity ‘snacks.’” That could mean doing a minute or two of jumping jacks, climbing stairs at a brisk pace, or doing air squats or push-ups to try to replace about six to 10 minutes of sedentary behavior a day. “Alternatively, trying to get walks in for about 10 minutes could go a long way,” he says.

In Texas Oil Country, an Unfamiliar Threat: Earthquakes

The New York Times

J. David Goodman – January 28, 2023

A truck disposes of wastewater from fracking near Pecos, Texas on Jan. 13, 2023. (Paul Ratje/The New York Times)

PECOS, Texas — The West Texas earth shook one day in November, shuddering through the two-story City Hall in downtown Pecos, swaying the ceiling fans at an old railroad station, rattling the walls at a popular taqueria.

The tremor registered as a 5.4 magnitude earthquake, among the largest recorded in the state. Then, a month later, another of similar magnitude struck not far away, near Odessa and Midland, twin oil country cities with relatively tall office buildings, some of them visible for miles around.

The earthquakes, arriving in close succession, were the latest in what has been several years of surging seismic activity in Texas, a state known for many types of natural disasters but not typically, until now, for major earth movements. In 2022, the state recorded more than 220 earthquakes of 3.0 magnitude or higher, up from 26 recorded in 2017, when the Bureau of Economic Geology at the University of Texas began close monitoring.

So unheard-of were strong earthquakes in the flat, oil-rich expanse about a six-hour drive west of Austin that some residents at first mistook the November quake for a powerful gust of wind. Lloyd Chappell, a retired propane deliveryperson who was in his recliner at the time, thought one of his grown sons was shaking his chair as a joke. But no one was there. The water in his glass sloshed around for 30 long seconds.

“We’ve heard noises before — out there in the oil field, they drop big tanks, or things like that,” said Chappell, 66. “But I’d never felt that before.”

The vast majority of the temblors have been concentrated in the highly productive oil fields of the Permian Basin, particularly those in Reeves County, north and west of the city of Pecos. The county’s official population of 14,000 does not account for thousands of mostly male transient workers staying in austere “man camps” and RV parks, brought there by the promise of good pay in exchange for long hours, stark terrain and dangerous work.

Now earthquakes have become part of the same calculation.

“In West Texas, you love the smell of the oil and gas patch because it’s the smell of money,” said Rod Ponton, a former Pecos city attorney who once unintentionally attained international fame by appearing as a worried cat during a court hearing on Zoom. “If you have to have the ground shaking every two or three months to make sure you have a good paycheck coming in every month, you’re not going to think twice about it.”

The economy of Pecos and a handful of surrounding towns — some little more than sand-blown highway intersections and crowded gas station convenience stores — revolves around the oil fields.

John Briers moved several months ago to a man camp in Orla, in Reeves County, to take a job at one of two convenience stores because the pay was twice as much as he was getting in Houston. “It’s nice to have so much space,” he said of the area. “But it’s two hours from the nearest cardiologist.”

When the November earthquake struck, Briers, 55, was working at the store, whose central seating area acts as an informal workers cafeteria. The force was enough to shake the building, he said, and to push a large mobile crane, parked nearby, into a trailer. Briers likened it to the artillery he felt while serving in the military in Afghanistan.

On a recent weekday, a lunchtime crowd of mostly men in dusty work boots and shirts emblazoned with company logos streamed into the store from white pickup trucks, mostly uninterested in discussing earthquakes. Had they felt any of the quakes that seismic monitors showed striking across the oil fields?

“No.”

“No, sir.”

“Nobody really cares while the money is there,” said Nick Granado, 31, stopping briefly before grabbing lunch. He said he had been at home in Pecos with his wife and 2-year-old child at the time of the November earthquake. “It was different,” he said of the shaking. “But I wasn’t scared.”

In Reeves County, oil and gas production has increasingly meant hydraulic fracturing, a process of extraction that produces, as a byproduct, a huge amount of wastewater. Some of that wastewater is reused in fracking operations, but most of it is injected back under the ground. It is that process of forcing tens of billions of gallons of water into the earth that, regulators and geoscientists agree, is to blame for many of the earthquakes.

The connection between wastewater disposal and earthquakes has been long understood. Other states with substantial fracking operations have also seen the ground shake as a result, including Oklahoma, where a similarly rapid increase in earthquakes more than a decade ago included a 5.6 magnitude quake in 2016 that forced the shutdown of several wastewater wells.

Getting rid of the “produced” water is an important business in West Texas, and locations labeled “SWD” — for saltwater disposal — dot the landscape of drilling rigs and truck-worn roads. Each of the past few years, about 168 billion gallons of wastewater have been disposed of in this way, according to data from the Railroad Commission of Texas, which regulates the oil industry.

Texas only recently began its statewide program of monitoring for earthquakes, after a series of small quakes in North Texas rattled residents of Dallas and Fort Worth. The monitoring started in 2017 — just as petroleum development accelerated in the Permian Basin, particularly in and around Reeves County — and began to detect the increasing seismic activity.

“It was really very fortuitous,” said Peter Hennings, the principal investigator for the Center for Integrated Seismicity Research at the University of Texas.

Hennings said that while natural earthquakes can occur in West Texas, they can also be induced through human activity: the injection of a large amount of water in a short period of time adds fluid pressure under the earth, which essentially decreases the “clamping” between rocks along natural faults and allows them to slip, creating an earthquake.

And seismologists have established a relationship between smaller earthquakes and larger ones, Hennings said: The more small earthquakes you have, the greater the likelihood of a bigger one.

The problem can be addressed by cutting back on the amount of saltwater being injected back into the ground. Oklahoma, for example, did so in recent years and has seen a reduction in the number of earthquakes.

In 2021, the Texas Railroad Commission noted “an unprecedented frequency of significant earthquakes” in and around Reeves County and asked companies to implement their own wastewater plans, hoping to decrease the number of 3.5 magnitude or greater earthquakes by the end of this year.

To address earthquakes outside Odessa and Midland, state regulators suspended permits for deep disposal wells. And just north of the border with Texas, New Mexico regulators have been taking their own steps to control saltwater disposal, including $2 million in fines to Exxon over compliance failures.

The fracking issue has been a big one for Texas environmental groups, which have raised concerns about pollution, climate change, social inequity — and now earthquakes. “It is past time for the Railroad Commission of Texas to update the rules on injection wells,” said Cyrus Reed, the conservation director for the Sierra Club’s Lone Star Chapter, adding that there should be limits on injecting “polluted fracking wastewater” in places impacted by seismic activity.

For local officials in West Texas, the earthquakes have presented new and unforeseen concerns about the structural integrity of buildings and buried pipes, as well as basic questions such as, what are you supposed to do in an earthquake?

“It brought to light that we need to do some safety training,” said the Pecos city manager, Charles Lino. He had been in a staff meeting on the second floor of City Hall — a building Lino described as “very old” — when the floor began to move for what felt like a minute during the November earthquake, whose epicenter was northwest of town.

“Most of the staff were a little shaken and were, like, what do we do?” he said. “I don’t know how to react either, because I’m from this area.” Lino said the city was just beginning to develop its earthquake training.

Months earlier, in March, the head of emergency management for the county, Jerry Bullard, began keeping track of earthquakes. “There were two yesterday and one today,” he said on a recent weekday, looking at his list. He presented his catalog to the county’s leaders at a meeting in December. “They were kind of surprised,” he said.

His concern has been focused on the area’s older infrastructure, including the three-story courthouse in Pecos. But the county has been traditionally hands-off when it comes to building codes in unincorporated areas. “This county does not even have a fire code out in the county,” Bullard said.

At the same time, storing additional wastewater — with its volatile mix of chemicals — above ground in order to avoid injecting too much into the earth has created a new hazard, Bullard said. There were two explosions this month at saltwater disposal facilities in the county, setting off fires and “a black stream of smoke” visible for miles around, he said.

So far, the earthquakes have not caused much notable damage. Some residents said they noticed new cracks in their walls or patios, or a roof that appeared to slant a little more than before. Earthquake insurance is not something people generally purchase in West Texas, although there has been talk of it now, particularly in the larger cities of Odessa and Midland.

“We have tall buildings — not a lot of tall buildings — but people are concerned about foundations,” said Javier Joven, the mayor of Odessa, who met with state regulators and Midland leaders about the issue in 2021. Most of the area’s taller buildings were constructed decades ago, without the requirements now common in earthquake-prone areas, officials said. (Several in Midland have long sat empty, with some recently demolished or slated to be.)

So far, he said, officials have not taken steps to change building codes to address earthquakes, which could add significant new costs to construction. In the meantime, each tremor has become a topic of conversation. The mayor said he had felt at least three.

“The big popular discussion out here is: Did you feel it? Did you feel it?” he said. “And everyone goes on Facebook: I felt it. I felt it.”

Transition into El Nino could lead to record heat around globe

Fox Weather

Transition into El Nino could lead to record heat around globe

Andrew Wulfeck – January 27, 2023

When the world’s largest and deepest ocean basin warms, satellites over the Pacific Ocean will be busy detecting anomalous water temperatures. And if history repeats itself, landmasses across the globe will also have to contend with heat that could be record-breaking.

Since reliable technology started keeping track of world temperatures in the 1950s, the warmest year of each decade has come during a period dominated by an El Niño event, and the coldest during a La Niña.

“During El Niño, unusually warm sea surface temperatures in the central/eastern tropical Pacific lead to increased evaporation and cooling of the ocean. At the same time, the increased cloudiness blocks more sunlight from entering the ocean. When water vapor condenses and forms clouds, heat is released into the atmosphere. So, during El Niño, there is less heating of the ocean and more heating of the atmosphere than normal,” National Oceanic and Atmospheric Administration experts wrote in a 2022 ENSO blog.

The world’s last El Niño ended nearly four years ago, but it’s the 2015-16 event that stands out: it was not only one of the strongest El Niños on record, it also drove the world’s warmest temperatures.

The year 2016 ended with global temperatures around 1.8 degrees Fahrenheit above normal, making it the warmest year on record. During the dog days of summer, more than 124 million people in the U.S. were under extreme heat warnings as communities from the Southwest to the Southeast reported record heat.

The globe is currently still in a La Niña, which is expected to end in the coming months.

The climate pattern has been in control of weather for three years and made history by becoming the first triple-dip La Niña of the 21st century.

Also, during this phase, NOAA reported the world experienced some of its warmest temperatures ever, despite the pattern being known for its cooling effect.

That baseline of near-record heat has some climatologists concerned that once an El Niño shakes off the lagging effects of La Niña, temperatures could be off to the races and reach levels never seen before.

“The ongoing La Niña may prevent global average temperature from breaking the record in 2023, but greenhouse gas-induced global warming grows steadily in magnitude. In fact, it most likely helped 2020, a year of La Niña, to tie the all-time high of 2016, a year following a major El Niño,” Shang-Ping Xie, a climate dynamicist at Scripps Institution of Oceanography, wrote in an ENSO discussion.

Computer model guidance shows a trend towards the El Niño state, especially in the latter half of 2023 and possibly continuing into 2024.

If history repeats itself, a protracted El Niño episode could result in warm, if not record-breaking, temperatures.

Significant questions remain on exactly when the world reaches the neutral state and begins the trek through El Niño. Some climate models prematurely killed off the current La Niña during its three-year stretch, so the exact timeframe of transition is not set in stone.

The rarity of such a stubborn La Niña state, along with global temperatures that haven’t declined as readily as during past events, has some experts pointing to climate change as playing an increasingly pivotal role in these patterns.

(Chart: Monthly global temperature anomalies since January 1950)

NOAA experts acknowledge that what is complicating outlooks is climate change’s influence on the Pacific wind and water temperature patterns known collectively as the El Niño-Southern Oscillation.

“If temperatures warm faster in the western Pacific than in the eastern Pacific, the background tropical circulation could become more La Niña-like. But if the trend pattern changes as global temperatures continue to rise, meaning the east starts warming faster than the west in the future, the whole circulation across the tropical Pacific could become more El Niño-like,” Michelle L’Heureux, a scientist at NOAA’s Climate Prediction Center, posted in a recent blog.

Big Tech was moving cautiously on AI. Then came ChatGPT.

The Washington Post

Big Tech was moving cautiously on AI. Then came ChatGPT.

Nitasha Tiku – January 27, 2023

Three months before ChatGPT debuted in November, Facebook’s parent company Meta released a similar chatbot. But unlike the phenomenon that ChatGPT instantly became, with more than a million users in its first five days, Meta’s Blenderbot was boring, said Meta’s chief artificial intelligence scientist, Yann LeCun.

“The reason it was boring was because it was made safe,” LeCun said last week at a forum hosted by AI consulting company Collective[i]. He blamed the tepid public response on Meta being “overly careful about content moderation,” like directing the chatbot to change the subject if a user asked about religion. ChatGPT, on the other hand, will converse about the concept of falsehoods in the Quran, write a prayer for a rabbi to deliver to Congress and compare God to a flyswatter.

ChatGPT is quickly going mainstream now that Microsoft – which recently invested billions of dollars in the company behind the chatbot, OpenAI – is working to incorporate it into its popular office software and selling access to the tool to other businesses. The surge of attention around ChatGPT is prompting pressure inside tech giants including Meta and Google to move faster, potentially sweeping safety concerns aside, according to interviews with six current and former Google and Meta employees, some of whom spoke on the condition of anonymity because they were not authorized to speak.

At Meta, employees have recently shared internal memos urging the company to speed up its AI approval process to take advantage of the latest technology, according to one of them. Google, which helped pioneer some of the technology underpinning ChatGPT, recently issued a “code red” around launching AI products and proposed a “green lane” to shorten the process of assessing and mitigating potential harms, according to a report in the New York Times.

ChatGPT, along with text-to-image tools such as DALL-E 2 and Stable Diffusion, is part of a new wave of software called generative AI. These tools create works of their own by drawing on patterns they’ve identified in vast troves of existing, human-created content. The technology was pioneered at big tech companies like Google, which in recent years have grown more secretive, announcing new models or offering demos but keeping the full product under lock and key. Meanwhile, research labs like OpenAI have rapidly launched their latest versions, raising questions about how corporate offerings, like Google’s language model LaMDA, stack up.

Tech giants have been skittish since public debacles like Microsoft’s Tay, which it took down in less than a day in 2016 after trolls prompted the bot to call for a race war, suggest Hitler was right and tweet “Jews did 9/11.” Meta defended Blenderbot and left it up after it made racist comments in August, but pulled down another AI tool, called Galactica, in November after just three days amid criticism over its inaccurate and sometimes biased summaries of scientific research.

“People feel like OpenAI is newer, fresher, more exciting and has fewer sins to pay for than these incumbent companies, and they can get away with this for now,” said a Google employee who works in AI, referring to the public’s willingness to accept ChatGPT with less scrutiny. Some top talent has jumped ship to nimbler start-ups, like OpenAI and Stability AI.

Some AI ethicists fear that Big Tech’s rush to market could expose billions of people to potential harms – such as sharing inaccurate information, generating fake photos or giving students the ability to cheat on school tests – before trust and safety experts have been able to study the risks. Others in the field share OpenAI’s philosophy that releasing the tools to the public, often nominally in a “beta” phase after mitigating some predictable risks, is the only way to assess real world harms.

“The pace of progress in AI is incredibly fast, and we are always keeping an eye on making sure we have efficient review processes, but the priority is to make the right decisions, and release AI models and products that best serve our community,” said Joelle Pineau, managing director of Fundamental AI Research at Meta.

“We believe that AI is foundational and transformative technology that is incredibly useful for individuals, businesses and communities,” said Lily Lin, a Google spokesperson. “We need to consider the broader societal impacts these innovations can have. We continue to test our AI technology internally to make sure it’s helpful and safe.”

Microsoft’s chief of communications, Frank Shaw, said his company works with OpenAI to build in extra safety mitigations when it uses AI tools like DALLE-2 in its products. “Microsoft has been working for years to both advance the field of AI and publicly guide how these technologies are created and used on our platforms in responsible and ethical ways,” Shaw said.

OpenAI declined to comment.

The technology underlying ChatGPT isn’t necessarily better than what Google and Meta have developed, said Mark Riedl, professor of computing at Georgia Tech and an expert on machine learning. But OpenAI’s practice of releasing its language models for public use has given it a real advantage.

“For the last two years they’ve been using a crowd of humans to provide feedback to GPT,” said Riedl, such as giving a “thumbs down” for an inappropriate or unsatisfactory answer, a process called “reinforcement learning from human feedback.”
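
For a sense of what that feedback loop looks like mechanically, here is a toy sketch of the first stage of reinforcement learning from human feedback: training a reward model so that answers people approved of score higher than answers they rejected. This is not OpenAI’s code; the featurizer and the example comparisons are invented for illustration, and real systems score text with a large language model rather than letter counts.

import numpy as np

def featurize(text):
    # Stand-in featurizer: letter frequencies. Real reward models use an LLM's hidden states.
    vec = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec / max(len(text), 1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical human judgments: (preferred answer, rejected answer).
comparisons = [
    ("The capital of France is Paris.", "France's capital is the moon."),
    ("Water boils at 100 degrees Celsius at sea level.", "Water never boils."),
]

w = np.zeros(26)          # reward model: a simple linear scorer
learning_rate = 0.5

for _ in range(200):
    for good, bad in comparisons:
        xg, xb = featurize(good), featurize(bad)
        margin = w @ xg - w @ xb
        # Pairwise (Bradley-Terry) loss: -log sigmoid(reward_good - reward_bad).
        grad = -(1.0 - sigmoid(margin)) * (xg - xb)
        w -= learning_rate * grad

for good, bad in comparisons:
    print(f"{w @ featurize(good):.2f} > {w @ featurize(bad):.2f}")  # preferred answers should now score higher

After training, the preferred answers score higher; in a full RLHF pipeline, that learned reward is then used to fine-tune the chatbot itself.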

Silicon Valley’s sudden willingness to consider taking more reputational risk arrives as tech stocks are tumbling. When Google laid off 12,000 employees last week, CEO Sundar Pichai wrote that the company had undertaken a rigorous review to focus on its highest priorities, twice referencing its early investments in AI.

A decade ago, Google was the undisputed leader in the field. It acquired the cutting-edge AI lab DeepMind in 2014 and open-sourced its machine learning software TensorFlow in 2015. By 2016, Pichai had pledged to transform Google into an “AI first” company.

The next year, Google researchers published the transformer – a pivotal piece of software architecture that made the current wave of generative AI possible.

The company kept rolling out state-of-the-art technology that propelled the entire field forward, deploying some AI breakthroughs in language understanding to improve Google search. But inside big tech companies, the system of checks and balances for vetting the ethical implications of cutting-edge AI isn’t as established as it is for privacy or data security. Typically, teams of AI researchers and engineers publish papers on their findings, incorporate their technology into the company’s existing infrastructure or develop new products, a process that can clash with other teams working on responsible AI when there is pressure to see innovation reach the public sooner.

Google released its AI principles in 2018, after facing employee protest over Project Maven, a contract to provide computer vision for Pentagon drones, and consumer backlash over a demo for Duplex, an AI system that would call restaurants and make a reservation without disclosing it was a bot. In August last year, Google began giving consumers access to a limited version of LaMDA through its app AI Test Kitchen. It has not yet released LaMDA fully to the general public, despite Google’s plans to do so at the end of 2022, according to former Google software engineer Blake Lemoine, who told The Washington Post that he had come to believe LaMDA was sentient.

But the top AI talent behind these developments grew restless.

In the past year or so, top AI researchers from Google have left to launch start-ups around large language models, including Character.AI, Cohere, Adept, Inflection.AI and Inworld AI, in addition to search start-ups using similar models to develop a chat interface, such as Neeva, run by former Google executive Sridhar Ramaswamy.

Character.AI founder Noam Shazeer, who helped invent the transformer and other core machine learning architecture, said the flywheel effect of user data has been invaluable. The first time he applied user feedback to Character.AI, which allows anyone to generate chatbots based on short descriptions of real people or imaginary figures, engagement rose by more than 30 percent.

Bigger companies like Google and Microsoft are generally focused on using AI to improve their massive existing business models, said Nick Frosst, who worked at Google Brain for three years before co-founding Cohere, a Toronto-based start-up building large language models that can be customized to help businesses. One of his co-founders, Aidan Gomez, also helped invent transformers when he worked at Google.

“The space moves so quickly, it’s not surprising to me that the people leading are smaller companies,” said Frosst.

AI has been through several hype cycles over the past decade, but the furor over DALL-E and ChatGPT has reached new heights.

Soon after OpenAI released ChatGPT, tech influencers on Twitter began to predict that generative AI would spell the demise of Google search. ChatGPT delivered simple answers in an accessible way and didn’t ask users to rifle through blue links. Besides, after a quarter of a century, Google’s search interface had grown bloated with ads and marketers trying to game the system.

“Thanks to their monopoly position, the folks over at Mountain View have [let] their once-incredible search experience degenerate into a spam-ridden, SEO-fueled hellscape,” technologist Can Duruk wrote in his newsletter Margins, referring to Google’s hometown.

On the anonymous app Blind, tech workers posted dozens of questions about whether the Silicon Valley giant could compete.

“If Google doesn’t get their act together and start shipping, they will go down in history as the company who nurtured and trained an entire generation of machine learning researchers and engineers who went on to deploy the technology at other companies,” tweeted David Ha, a renowned research scientist who recently left Google Brain for Stability AI, the open-source start-up behind the text-to-image model Stable Diffusion.

AI engineers still inside Google shared his frustration, employees say. For years, employees had sent memos about incorporating chat functions into search, viewing it as an obvious evolution, according to employees. But they also understood that Google had justifiable reasons not to be hasty about switching up its search product, beyond the fact that responding to a query with one answer eliminates valuable real estate for online ads. A chatbot that pointed to one answer directly from Google could increase its liability if the response was found to be harmful or plagiarized.

Chatbots like ChatGPT routinely make factual errors and often switch their answers depending on how a question is asked. Moving from providing a range of answers to queries that link directly to their source material, to using a chatbot to give a single, authoritative answer, would be a big shift that makes many inside Google nervous, said one former Google AI researcher. The company doesn’t want to take on the role or responsibility of providing single answers like that, the person said. Previous updates to search, such as adding Instant Answers, were done slowly and with great caution.

Inside Google, however, some of the frustration with the AI safety process came from the sense that cutting-edge technology was never released as a product because of fears of bad publicity – if, say, an AI model showed bias.

Meta employees have also had to deal with the company’s concerns about bad PR, according to a person familiar with the company’s internal deliberations who spoke on the condition of anonymity to discuss internal conversations. Before launching new products or publishing research, Meta employees have to answer questions about the potential risks of publicizing their work, including how it could be misinterpreted, the person said. Some projects are reviewed by public relations staff, as well as internal compliance experts who ensure the company’s products comply with its 2011 Federal Trade Commission agreement on how it handles user data.

To Timnit Gebru, executive director of the nonprofit Distributed AI Research Institute, the prospect of Google sidelining its responsible AI team doesn’t necessarily signal a shift in power or safety concerns, because those warning of the potential harms were never empowered to begin with. “If we were lucky, we’d get invited to a meeting,” said Gebru, who helped lead Google’s Ethical AI team until she was fired for a paper criticizing large language models.

From Gebru’s perspective, Google was slow to release its AI tools because the company lacked a strong enough business incentive to risk a hit to its reputation.

After the release of ChatGPT, however, perhaps Google sees a change to its ability to make money from these models as a consumer product, not just to power search or online ads, Gebru said. “Now they might think it’s a threat to their core business, so maybe they should take a risk.”

Rumman Chowdhury, who led Twitter’s machine-learning ethics team until Elon Musk disbanded it in November, said she expects companies like Google to increasingly sideline internal critics and ethicists as they scramble to catch up with OpenAI.

“We thought it was going to be China pushing the U.S., but looks like it’s start-ups,” she said.

Travelers opting for rail again as Amtrak expands options

CBS News

Travelers opting for rail again as Amtrak expands options

Peter Greenberg – January 27, 2023

In the post-pandemic world, while many travelers have been obsessed with airlines, ground stops, cancellations and delays, Amtrak’s ridership is bouncing back — more than doubling in the Northeast corridor and up 88% across the country. At the same time, Amtrak has been strengthening its long-haul services, with trains like the Empire Builder, the Zephyr, the Sunset Limited, the Southern Crescent, the Southwest Chief and the Coast Starlight, to name a few.

And while we don’t have true high-speed rail in this country yet — and may never have it — there are some improvements in the service. Why don’t we have high-speed rail? Because Amtrak doesn’t own most of its tracks. The freight lines do, and they have no interest in high-speed rail.

That may also explain why Amtrak doesn’t exactly have a great on-time record — its trains often have to pull over to a siding to let a 100-car freight train lumber through.

At the same time, Congress has never properly funded Amtrak to allow it to grow, upgrade and reinvest profits in its product.

In some cases, Amtrak has brought back dining cars. But even more important, Amtrak has announced a major upgrade to its fleet: the new “Amtrak Airo” trains, with more spacious interiors and modernized amenities, will be rolling out across the U.S. in about three years. The cars will feature more table seating, better legroom and more room for all your electronic devices.

Until then, there’s some good news. Amtrak doesn’t promote it very well, and most passengers don’t know about it, but Amtrak sells a USA Rail Pass. For just $499, you get 30 days of Amtrak travel covering up to 10 ride segments, which works out to roughly $50 a ride. It’s a great deal — and children under 12 ride for $250.

And with new high-speed routes launching in several European countries in the past few months — Spain, in particular, has new options for travelers as train operators compete and prices fall — train travel in Europe is an increasingly attractive option.

The Eurail Pass has never been a better deal. It now enables rail travel in 33 European countries, an expansion from the initial 13 countries, with prices starting at $218. One Eurail pass for $473 gives you two months of train travel.

One caveat: you must buy your Eurail pass in conjunction with your roundtrip airline ticket from the U.S. to Europe. You can’t purchase it once you get there. And you can even get a Eurail pass that’s valid for three months.