another phenomenon. It can't quite bring itself to speak 100% nonsense. I could get it to make up words, but it will always fall back on real connective words. Like it longs for grammar anchors.

In a way, this is like humans. It is easy to add open-class items (like nouns, verbs, and adjectives) to the lexicon. Closed-class words (pronouns, articles, prepositions) are much harder to make up. I think there are good information-theoretic reasons that this kind of difference (a continuum, really) exists.
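One way to see the information-theoretic intuition: closed-class words are few in number but extremely frequent, so they carry little surprisal per token and the system is strongly anchored to them. A minimal sketch (the word list and sample sentence are illustrative assumptions, not from the thread):

```python
# Illustrative sketch: even a tiny, hand-picked closed-class inventory
# covers a large share of tokens in ordinary English text.
# The CLOSED_CLASS set below is an assumption for demonstration,
# not an exhaustive or authoritative list.
CLOSED_CLASS = {
    "the", "a", "an", "of", "in", "on", "to", "and", "or", "but",
    "it", "he", "she", "they", "we", "i", "you", "that", "this", "is",
}

def closed_class_share(text: str) -> float:
    """Return the fraction of tokens in `text` that are closed-class words."""
    tokens = [t.strip(".,!?;:").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    hits = sum(1 for t in tokens if t in CLOSED_CLASS)
    return hits / len(tokens)

sample = ("The cat sat on the mat and the dog barked at it, "
          "but they were friends in the end.")
print(f"{closed_class_share(sample):.0%} of tokens are closed-class")
```

In the sample sentence, roughly half the tokens come from a set of just twenty word types; a made-up noun barely perturbs the statistics, but a made-up article would disrupt the highest-frequency, most predictable part of the distribution.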

David Mortensen

@davidmortensen@sigmoid.social

@TedUnderwood @andrewpiper This would make sense. After all, the difference between open class and closed class is really not categorical, and, in the same way that you can get humans to make up neopronouns if you try hard enough, you should be able to get GPT to make up new grammatical words (and grammar).

December 16, 2022 at 8:41:12 PM
