I thought vibe litigation was a high-risk activity, but turning to a chatbot for medical advice is an even worse life choice.

Ars Technica
After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis
Literal “hallucinations” were the result.