Hot Paths

Protesters Accuse Google of Breaking Its Promises on AI Safety

A full-blown courtroom drama, complete with a gavel-wielding judge and an attentive jury, played out in London's King's Cross on Monday, mere steps from Google DeepMind's headquarters.

Google was on trial for allegations of breaking its promises on AI safety.

The participants of this faux-production were protesters from PauseAI, an activist group concerned that tech companies are racing into AI with little regard for safety. On Monday, the group congregated near King’s Cross station to demand that Google be more transparent about the safety checks it’s running on its most cutting-edge AI models.


Protesters pose outside Google DeepMind’s office.

Hugh Langley/Business Insider



PauseAI argues that Google broke a promise it made at the 2024 AI Safety Summit in Seoul, South Korea, where the company agreed to consider external evaluations of its models and to publish details about how external parties, including governments, were involved in assessing the risks.

When Google launched Gemini 2.5 Pro, its latest frontier model, in April, it did neither of those things, saying that the model was still "experimental." A few weeks later, it released a "model card" with some safety information, which some experts criticized as thin on detail, TechCrunch previously reported. While the safety report referred to third-party testers, it did not specify who they were.

For PauseAI, this isn't good enough. More important, the organization said, is not letting any lapse slip by, which would allow Google to set a precedent.

“If we let Google get away with breaking their word, it sends a signal to all other labs that safety promises aren’t important and commitments to the public don’t need to be kept,” said PauseAI organizing director Ella Hughes, addressing the crowd, which had gradually swelled to around 60 people.

“Right now, AI companies are less regulated than sandwich shops.”


Protesters demonstrate at London’s King’s Cross.

Hugh Langley/Business Insider



There’s a lot to worry about when it comes to AI. Economic disruption. Job displacement. Misinformation. Deepfakes. The annihilation of humanity as we know it.

Focusing on the specific issue of the Google safety report is a way for PauseAI to push for a specific and attainable near-term change.

About 30 minutes into the protest, several intrigued passers-by had joined the cause. After a rousing speech from Hughes, the group proceeded to Google DeepMind’s offices, where the fake courtroom production played out. Some Google employees leaving for the day looked bemused as chants of “Stop the race, it’s unsafe” and “Test, don’t guess” rang out.

“AI regulation on an international level is in a very bad place,” PauseAI founder Joep Meindertsma told Business Insider, pointing to how US Vice President JD Vance warned against over-regulating AI at the AI Action Summit.


A fake courtroom trial takes place outside Google DeepMind’s office.

Hugh Langley/Business Insider



Monday was the first time PauseAI had gathered over this specific issue, and it's not clear what comes next. The group is engaging with members of the UK Parliament who will run these concerns up the flagpole, but Meindertsma was reluctant to say much about how Google is engaging with the group and its demands. (A Google spokesperson did not respond to a request for comment for this story.)

Meindertsma hopes support will grow and references polls that suggest the public at large is concerned that AI is moving too fast. The group on Monday was made up of people from different backgrounds, including some who work in tech. Meindertsma himself runs a software development company and regularly uses AI tools from Google, OpenAI, and others.

“Their tools are incredibly impressive,” he said, “which is the thing that worries me so much.”

Have something to share? Contact this reporter via email at hlangley@businessinsider.com or Signal at 628-228-1836. Use a personal email address and a nonwork device; here’s our guide to sharing information securely.
