    OpenAI still has a governance problem

By Press Room | May 8, 2025

    It can be hard to train a chatbot. Last month, OpenAI rolled back an update to ChatGPT because its “default personality” was too sycophantic. (Maybe the company’s training data was taken from transcripts of US President Donald Trump’s cabinet meetings . . .)

    The artificial intelligence company had wanted to make its chatbot more intuitive but its responses to users’ enquiries skewed towards being overly supportive and disingenuous. “Sycophantic interactions can be uncomfortable, unsettling, and cause distress. We fell short and are working on getting it right,” the company said in a blog post.

    Reprogramming sycophantic chatbots may not be the most crucial dilemma facing OpenAI but it chimes with its biggest challenge: creating a trustworthy personality for the company as a whole. This week, OpenAI was forced to roll back its latest planned corporate update designed to turn the company into a for-profit entity. Instead, it will transition to a public benefit corporation, remaining under the control of a non-profit board. 

    That will not resolve the structural tensions at the core of OpenAI. Nor will it satisfy Elon Musk, one of the company’s co-founders, who is pursuing legal action against OpenAI for straying from its original purpose. Does the company accelerate AI product deployment to keep its financial backers happy? Or does it pursue a more deliberative scientific approach to remain true to its humanitarian intentions?

OpenAI was founded in 2015 as a non-profit research lab dedicated to developing artificial general intelligence for the benefit of humanity. But the company’s mission, along with the definition of AGI, has since blurred.

    Sam Altman, OpenAI’s chief executive, quickly realised that the company needed vast amounts of capital to pay for the research talent and computing power required to stay at the forefront of AI research. To that end, OpenAI created a for-profit subsidiary in 2019. Such was the breakout success of chatbot ChatGPT that investors have been happy to throw money at it, valuing OpenAI at $260bn during its latest fundraise. With 500mn weekly users, OpenAI has become an “accidental” consumer internet giant.

Altman, who was fired and rehired by the non-profit board in 2023, now says that he wants to build a “brain for the world” that might require hundreds of billions, if not trillions, of dollars of further investment. The only trouble with his wild-eyed ambition is that, as the tech blogger Ed Zitron rants in increasingly salty terms, OpenAI has yet to develop a viable business model. Last year, the company spent $9bn and lost $5bn. Is its financial valuation based on a hallucination? There will be mounting pressure from investors for OpenAI to commercialise its technology rapidly.

    Moreover, the definition of AGI keeps shifting. Traditionally, it has referred to the point at which machines surpass humans across a wide range of cognitive tasks. But in a recent interview with Stratechery’s Ben Thompson, Altman acknowledged that the term had been “almost completely devalued”. He did accept, however, a narrower definition of AGI as an autonomous coding agent that could write software as well as any human.

    On that score, the big AI companies seem to think they are close to AGI. One giveaway is reflected in their own hiring practices. According to Zeki Data, the top 15 US AI companies had been frantically hiring software engineers at a rate of up to 3,000 a month, recruiting a total of 500,000 between 2011 and 2024. But lately their net monthly hiring rate has dropped to zero as these companies anticipate that AI agents can perform many of the same tasks.

[Line chart: software engineering hires per month at the top US AI companies]

    A recent research paper from Google DeepMind, which also aspires to develop AGI, highlighted four main risks of increasingly autonomous AI models: misuse by bad actors; misalignment when an AI system does unintended things; mistakes which cause unintentional harm; and multi-agent risks when unpredictable interactions between AI systems produce bad outcomes. These are all mind-bending challenges that carry some potentially catastrophic risks and may require some collaborative solutions. The more potent AI models become, the more cautious developers should be in deploying them. 

    How frontier AI companies are governed is therefore not just a matter for corporate boards and investors, but for all of us. OpenAI is still worryingly deficient in that regard, with conflicting impulses. Wrestling with sycophancy is going to be the least of its problems as we get closer to AGI, however you define it.

    john.thornhill@ft.com
