CryptoNews

San Francisco protesters converged on OpenAI, Anthropic and xAI offices calling for a pause on “frontier AI”

TL;DR

  • Activists protest OpenAI, Anthropic, xAI over frontier AI risks and development pace.
  • Demonstrators demand pause on models outpacing human control, plus governance and transparency.
  • Grievances target Anthropic’s commitments and OpenAI’s defense deal amid policy scrutiny.

Around 200 people took to the streets of San Francisco on Saturday, stopping outside the offices of Anthropic, OpenAI, and xAI to demand a coordinated pause in the development of more powerful artificial intelligence systems.

Participants included researchers, academics, and members of organizations such as the Machine Intelligence Research Institute, PauseAI, QuitGPT, StopAI, and Evitable. The march started at noon outside Anthropic’s offices, then moved to OpenAI and finally to xAI, with speakers addressing the crowd at each stop.

Michael Trazzi, founder of Stop the AI Race and a documentarian, organized the demonstration around a central demand: major AI companies must agree to a coordinated pause in building more capable models and push for treaties with international AI developers to do the same. “If China and the U.S. agreed to stop building more dangerous models, they could focus on making systems better for people — like medical AI,” Trazzi said. “Everyone would be better off.”

Stop the AI Race’s proposal conditions any pause on other major labs making a credible, verifiable commitment to do likewise. For Trazzi, marching outside the companies’ offices carries practical intent beyond symbolism — he wants employees to take the message to their leadership internally. “We want to talk to them and have them talk to their leadership,” he said, adding that whistleblowers carry real power because “they’re the ones building it.”

The Trump Administration Bets on Winning the Race, Not Stopping It

Saturday’s march arrives amid direct tension with federal policy. Last week, the Trump Administration published its AI framework, presenting it as a national standard and framing it explicitly as a commitment to “winning the AI race.” Government officials and backers of continued AI development argue that slowing research in the United States hands an advantage to competitors abroad.

Trazzi pushed back on that framing. “Even in China or any country in the world, nobody wants systems they cannot control,” he said. “We’re in a race between companies and countries to build as fast as possible, taking shortcuts and cutting corners on safety. There are no winners in a race like that. What we get is a system we cannot control — and that’s why I call it a suicide race.”

Trazzi proposed limiting the computing power companies can use to train new models as a practical enforcement mechanism: “If you limit how much compute a company can use to build systems, you’re pretty much limiting the development of new models.”

In March 2023, the Future of Life Institute published an open letter calling for a moratorium on advancing the most powerful AI systems following the public launch of ChatGPT.

Signatories included xAI founder Elon Musk, Apple co-founder Steve Wozniak, and Ripple co-founder Chris Larsen. The letter now carries more than 33,000 signatures. In September, Trazzi staged a week-long hunger strike outside Google DeepMind’s London offices, while Guido Reichstadter held a parallel hunger strike outside Anthropic’s San Francisco headquarters.

Following Saturday’s protest, Trazzi said the group plans to bring demonstrations to other cities where major AI labs operate. Anthropic, OpenAI, and xAI did not respond to requests for comment before publication.

