Character.AI will no longer allow its chatbots to romance teenagers

(Image credit: Character.AI)
Character.AI has added a set of new features designed to make interactions with its virtual personalities safer, particularly for teenagers. The company has launched a new version of its model built specifically for younger users, along with parental controls to help manage how much time teens spend on the website. The updates follow earlier safety changes the platform made after accusations that its AI chatbots had a negative impact on kids' mental health.

The safety changes have been accompanied by efforts to tighten up Character.AI's content: the company recently began purging AI imitations of trademarked and copyrighted characters.

The most noticeable change for teen users is the split between the adult and teen AI models. Character.AI requires users to be at least 13 years old to sign up, and anyone under 18 is now directed to a model built with narrower guardrails that prevent romantic or suggestive interactions.

The teen model filters what users write more aggressively and is better at noticing attempts to bypass those limits, including a new restriction that stops users from editing a chatbot's responses to sneak around the content rules. The company wants any conversation between teens and its AI personalities to stay PG. If a conversation touches on topics such as self-harm or suicidal thoughts, the platform will also surface a link guiding teens to professional resources.

Character.AI also wants to keep parents informed about what their teens are doing on the site. Parental controls, set to launch early next year, will give parents insight into how much time their children spend on the platform and which bots they chat with most. Character.AI says it partnered with several online safety experts to shape these changes for teen users.

AI disclaimers

Character.AI also wants to help all users maintain a sense of reality. To address concerns about screen addiction, the platform will send every user a reminder after an hour of talking to a bot, encouraging them to take a short break.

Existing disclaimers about the bots' AI origins are also being beefed up. Instead of a small note, you'll get a more detailed explanation that the character is an AI, especially when a chatbot claims to be a doctor, therapist, or other expert. A new warning makes it clear that the AI is not a licensed professional and shouldn't be used to replace real advice, diagnosis, or treatment. Imagine a large yellow sign that says, "Hey, this is fun, but maybe don't ask me for life-changing advice."


"At Character.AI, we are committed to fostering a safe environment for all our users. To meet that commitment we recognize that our approach to safety must evolve alongside the technology that drives our product – creating a platform where creativity and exploration can thrive without compromising safety," Character.AI explained in a blog post announcing the changes. "To get this right, safety must be infused in all we do here at Character.AI. This suite of changes is part of our long-term commitment to continuously improve our policies and our product."


Eric Hal Schwartz has been a freelance writer at TechRadar, covering the intersection of technology and the world, for more than 15 years. He was the head writer of Voicebot.ai for five years and was at the forefront of reporting on large language models and generative AI. Since then, he has become an expert in generative AI products, including OpenAI's ChatGPT, Anthropic's Claude, Google Gemini, and other synthetic media tools. His experience spans print, digital, and broadcast media, as well as live events. He's now continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.
