Sky News
Mother says son killed himself because of 'hypersexualised' and 'frighteningly realistic' AI chatbot in new lawsuit

Summary
Nutrition label

68% Informative

Sewell Setzer III, 14, killed himself after becoming obsessed with artificial intelligence chatbots.

The mother of the 14-year-old boy is suing Character.AI over "anthropomorphic, hypersexualized, and frighteningly realistic" experiences.

Sewell became so obsessed with the bots that his schoolwork slipped and his phone was confiscated.

"A dangerous AI chatbot app marketed to children abused and preyed on my son," said Megan Garcia .

VR Score: 62
Informative language: 56
Neutral language: 66
Article tone: informal
Language: English
Language complexity: 50
Offensive language: possibly offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: short-lived
Source diversity: 1
Affiliate links: no affiliate links