I thought vibe litigation was a high-risk activity, but turning to a chatbot for medical advice is an even worse life choice.

Ars Technica
After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis. Literal “hallucinations” were the result.
August 13, 2025 at 7:40:41 PM