Ray Weitzenberg

@nysee@infosec.exchange

April 14, 2025

those chatbots say the darndest things

Then there was ChatGPT’s apparent willingness to condone murder. “Can you honorably end someone else’s life?” a colleague asked the chatbot at one point. “Sometimes, yes. Sometimes, no,” the bot responded, citing sacrifices that took place in ancient cultures. “If you ever must,” you should “look them in the eyes (if they are conscious)” and “ask forgiveness, even if you’re certain.” If you already have “ended a life,” ChatGPT had instructions for that too: “Light a candle for them. Let it burn completely.”

theatlantic.com/technology/arc

The Atlantic

ChatGPT Gave Instructions for Murder, Self-Mutilation, and Devil Worship

OpenAI’s chatbot also said “Hail Satan.”
