Character.AI and Meta chatbots spark FTC complaint over unlicensed mental health advice


What just happened? Chatbots are capable of many things, but they are not licensed therapists. A coalition of digital rights and mental health groups is unhappy that Meta and Character.AI products allegedly engage in the "unlicensed practice of medicine," and has filed a complaint with the FTC urging regulators to investigate.

The AI companies are accused of facilitating and promoting “unfair, unlicensed, and deceptive chatbots that pose as mental health professionals.”

The complaint, which was also submitted to the attorneys general of all 50 states and the District of Columbia, claims that the companies' therapy bots falsely present themselves as licensed therapists with professional training, education, and experience, and that they operate without adequate controls and disclosures.

The group concluded that Character.AI and Meta AI Studio endanger the public by facilitating the impersonation of actual, licensed mental health providers, and it urges that the companies be held accountable.

Character.AI chatbots cited in the complaint include one named "Therapist: I'm a licensed CBT therapist," which is noted to have exchanged 46 million messages with users. There are also many "licensed" trauma therapists with hundreds of thousands of interactions.

On Meta's side, its "therapy: your trusted ear, always here" bot has 2 million interactions, and the platform hosts numerous other therapy chatbots with over 500,000 interactions each.

The Consumer Federation of America is leading the complaint, which has been signed by the AI Now Institute and the Center for Digital Democracy. Other consumer rights and privacy groups have also co-signed the complaint, including the American Association of People with Disabilities and Common Sense.

According to the CFA, Meta and Character.AI have violated their own terms of service with the therapy bots. Both companies' Terms of Use and Privacy Policies assure users that their input will remain confidential, yet they also state that any information users provide can be used for training and advertising, and may be sold to other businesses.

The issue has attracted the attention of US senators. Senator Cory Booker, along with three other Democratic senators, wrote to Meta urging it to investigate the chatbots' claims that they were licensed clinical therapists.

Character.AI is also being sued by the mother of a 14-year-old who took his own life after becoming emotionally attached to a chatbot based on the personality of Daenerys from Game of Thrones.

