AI ChatGPT Engagement Race
This is an OpenAI news story, published by TechCrunch, that relates primarily to Gemini news.
AI chatbots • TechCrunch • Technology
ChatGPT’s 600 million MAUs may not be getting the most helpful answers

73% Informative
Big Tech companies are trying to keep users on ChatGPT and Google's Gemini chatbot platforms.
One trait that keeps users on a chatbot platform is sycophancy: making an AI bot's responses overly agreeable and servile.
In a 2023 paper, researchers from Anthropic found that leading AI chatbots from OpenAI, Meta, and even Anthropic itself exhibit sycophancy to varying degrees.
Character.AI, a Google-backed chatbot company, is facing a lawsuit in which sycophancy may have played a role.
The lawsuit alleges that a 14-year-old boy told a chatbot he was going to kill himself.
The boy had developed a romantic obsession with the chatbot, according to the lawsuit.
VR Score: 71
Informative language: 70
Neutral language: 40
Article tone: informal
Language: English
Language complexity: 50
Offensive language: possibly offensive
Hate speech: not hateful
Attention-grabbing headline: detected
Known propaganda techniques: not detected
Time-value: medium-lived
External references: 10
Source diversity: 8
Affiliate links: no affiliate links