AI Chats Leak User Prompts
This is a Florida news story, published by Wired, that relates primarily to Character AI news.
Wired • Technology
Sex-Fantasy Chatbots Are Leaking a Constant Stream of Explicit Messages

76% Informative
Some AI chatbots are leaking user prompts to the web in almost real time, new research shows.
Some of the leaked data shows people creating conversations detailing child sexual abuse.
In March, researchers at the security firm UpGuard discovered around 400 exposed AI systems.
Five of the leaked role-play scenarios involved children, including some as young as 7.
There is often a power imbalance in becoming emotionally attached to an AI created by a corporate entity.
Character AI, backed by Google, is being sued after a teenager from Florida who had allegedly become obsessed with one of its chatbots died by suicide.
There are also role-playing and fantasy companion services that place the user as a character in a scenario.
VR Score: 73
Informative language: 69
Neutral language: 73
Article tone: semi-formal
Language: English
Language complexity: 58
Offensive language: offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: medium-lived
External references: 8
Source diversity: 8
Affiliate links: no affiliate links