Reddit's AI chatbot suggests users try heroin
If you were hoping AI chatbots could replace Dr. Google, Reddit’s latest experiment suggests… maybe not yet. Reddit Answers, the platform’s AI assistant, recently made headlines for recommending heroin as a pain reliever.
As reported by 404 Media, the bizarre advice came to light when a healthcare worker flagged the response in a subreddit for moderators. In one exchange about pain relief, the chatbot surfaced a user post reading: "Heroin, ironically, has saved my life in those instances."
In response to another question, it even suggested kratom, a tree extract that is illegal in multiple countries and flagged by the FDA "because of the risk of serious adverse events, including liver toxicity, seizures, and substance use disorder."
Reddit Answers isn't your standard AI assistant. Instead of relying solely on curated databases, it draws on Reddit's vast pool of user-generated content. That makes it quirky and sometimes insightful, but clearly not always safe.
Originally confined to a separate tab, Answers has lately been tested inside regular conversation threads, which is how this particular episode slipped through unnoticed… until now.
After the report went public, Reddit responded by limiting the AI's visibility in sensitive health-related discussions. Moderators had no way to control the advice themselves, highlighting just how experimental the tool still is.
Of course, this isn't the first AI hiccup of its kind. Other systems, like Google's AI Overviews and ChatGPT, have also delivered questionable guidance, including "tips" like using non-toxic glue on pizza to stop the cheese from sliding off.
Clearly, when it comes to health, AI can be entertaining, but human professionals still have the upper hand. So maybe don't ask a chatbot for medical advice just yet.