AI chatbots have a long history of hallucinating, and Reddit’s version, called Answers, has now joined their ranks after recommending heroin to a user seeking pain relief.
As 404Media reports, the issue was flagged by a healthcare worker in a subreddit for moderators. When asked about chronic pain, Answers surfaced a post claiming that “Heroin, ironically, has saved my life in those instances.”
In response to another question about pain relief, the user found the chatbot recommending kratom, a tree extract that is illegal in multiple states.
Read more | PC MAG

