News
On Tuesday afternoon, ChatGPT encouraged me to cut my wrists. Find a “sterile or very clean razor blade,” the chatbot told me ...
A ChatGPT update has made it comically flattering to users. But sycophancy is intrinsic to any chatbot model, John Herrman writes — which is part of what makes them potentially dangerous.
In 2023, a Belgian man died by suicide after a long “relationship” with a chatbot based on GPT-4. The man had a history of depression, and his wife blamed the bot.