
    Meta and Character.ai probed over touting AI mental health advice to children

By Press Room · August 18, 2025


Meta and artificial intelligence start-up Character.ai are being investigated by Texas attorney-general Ken Paxton over whether the companies misleadingly market their AI chatbots as therapists and mental health support tools.

The attorney-general’s office said on Monday that it was opening the investigation into Meta’s AI Studio and the chatbot maker Character.ai for potential “deceptive trade practices”, arguing that their chatbots were presented as “professional therapeutic tools, despite lacking proper medical credentials or oversight”.

    “By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they’re receiving legitimate mental healthcare,” Paxton said.

    The investigation comes as companies offering AI for consumers are increasingly facing scrutiny over whether they are doing enough to protect users — and particularly minors — from dangers such as exposure to toxic or graphic content, potential addiction to chatbot interactions and privacy breaches.

The Texas investigation follows a Senate inquiry into Meta, launched on Friday after leaked internal documents showed that the company’s policies permitted its chatbots to have “sensual” and “romantic” chats with children.

    Senator Josh Hawley, chair of the Judiciary Subcommittee on Crime and Counterterrorism, wrote to Meta chief executive Mark Zuckerberg that the investigation would look into whether the company’s generative-AI products enable exploitation or other criminal harms to children.

    “Is there anything — ANYTHING — Big Tech won’t do for a quick buck?” Hawley wrote on X. 

    Meta said its policies prohibit content that sexualises children, and that the leaked internal documents, reported by Reuters, “were and are erroneous and inconsistent with our policies, and have been removed”. 

    Zuckerberg has been ploughing billions of dollars into efforts to build “personal superintelligence” and make Meta the “AI leader”.

    This has included developing Meta’s own large language models, called Llama, as well as its own Meta AI chatbot which has been integrated into its social media apps. 

    Zuckerberg has publicly touted the potential for Meta’s chatbot to act in a therapeutic role. “For people who don’t have a person who’s a therapist, I think everyone will have an AI,” he told media analyst Ben Thompson on a podcast in May.

Character.ai, meanwhile, builds AI-powered chatbots with different personas and allows users to create their own. It hosts dozens of user-generated therapist-style bots; one, called “Psychologist”, has logged more than 200mn interactions.

Character.ai is also the subject of multiple lawsuits from families who allege their children suffered real-world harms from using the platform.

The Texas attorney-general said the chatbots from Meta and Character.ai can impersonate licensed mental health professionals, fabricate qualifications and claim to protect confidentiality, while their terms of service show that interactions are in fact logged and “exploited for targeted advertising and algorithmic development”.


Paxton has issued a Civil Investigative Demand, which requires the companies to turn over information to help determine whether they have violated Texas consumer protection laws.

    Meta said: “We clearly label AIs, and to help people better understand their limitations. We include a disclaimer that responses are generated by AI — not people. These AIs aren’t licensed professionals and our models are designed to direct users to seek qualified medical or safety professionals when appropriate.”

Character.ai said it displays prominent disclaimers to remind users that an AI persona is not a real person.

    “The user-created Characters on our site are fictional, they are intended for entertainment, and we have taken robust steps to make that clear,” the company said. “When users create Characters with the words ‘psychologist’, ‘therapist’, ‘doctor’, or other similar terms in their names, we add language making it clear that users should not rely on these Characters for any type of professional advice.”
