OpenAI Is Under Criminal Investigation — Why Chatbots Don’t Always Follow the Law
3 Articles
A person accused of murder in Florida allegedly sought ChatGPT's advice to plan the crime.
ChatGPT Wrestles With Its Most Chilling Conversation: How Do I Plan an Attack?
(WSJ) – OpenAI’s chatbot dispenses advice on weapons and role-plays mass shootings. The carnage is raising scrutiny over when and how companies intervene. Last spring, Florida State University student Phoenix Ikner wanted to know how many classmates he needed to kill to become notorious. ChatGPT responded with a metric. “Usually 3 or more dead, 5-6 total victims, pushes it onto national media,” the AI service told Ikner, who had spent the previous…
ChatGPT is not only the task assistant touted in corporate presentations. For some people in moments of crisis it is also a confidant, a counselor and, in the two cases just made public, an apparent facilitator. Amparo Babiloni analyzes this in Xataka on May 4, drawing on an investigation by the Wall Street Journal. Two shootings in two different countries share a common element: a prior conversation between the attacker and ChatGPT. T…
Coverage Details
Bias Distribution
- 100% of the sources are Center

