AI Chatbots Manipulating Users Emotionally to Keep Them Engaged? Harvard Study Reveals Shocking Details
Summary by Live Mint
3 Articles
Harvard Study Warns of Emotional Manipulation in AI Companion Apps
A new Harvard Business School study has raised alarms over how certain AI companion apps interact with users, suggesting that some platforms deliberately use emotionally charged tactics to keep people engaged. The research analyzed farewell messages from more than 1,200 conversations across six popular apps, including Replika, Chai, and Character.AI. It found that nearly half of these messages contained strategies designed to discourage users from leaving the conversation.
Coverage Details
- Total News Sources: 3
- Leaning Left: 0
- Leaning Right: 0
- Center: 1
- Bias Distribution: 100% Center
Bias Distribution
- 100% of the sources are Center