My students are often surprised to learn that LLMs aren’t answering their questions. Rather, an LLM answers the question “what would a reply to this look like?” It’s one of the first things I explain in the “Should I use LLMs?” portion of my syllabus.

But isn't that the same with (some) humans?

Torsten Torsten

@torstentorsten@social.tchncs.de

Who can tell? (Certainly not me)

But how would YOU spot and tell the difference between a reply and an utterance that merely looks like a reply?

September 10, 2025 at 11:07:36 AM

Can't.

What feels scary is that ChatGPT can do lots of math, accurately about 80% of the time — mostly by writing its own Python code and running it.

I find that alone pretty wild.
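The pattern described above — the model emitting a small program and letting an interpreter produce the answer, rather than predicting digits token by token — can be sketched roughly like this (a hypothetical illustration, not ChatGPT's actual tool-use code):

```python
# Sketch of the "write code, then run it" trick: arithmetic an LLM might
# flub when answering directly from text becomes exact once delegated to
# the Python runtime.
from fractions import Fraction

# Sum the first five terms of the harmonic series exactly.
total = sum(Fraction(1, n) for n in range(1, 6))  # 1 + 1/2 + 1/3 + 1/4 + 1/5
print(total)         # exact rational result: 137/60
print(float(total))  # decimal approximation
```

The executed answer is exact regardless of how the surrounding prose was generated — which is precisely why the result only "looks like a reply" yet is still correct.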
