In a way, this is like humans. It is easy to add open-class items (like nouns, verbs, and adjectives) to the lexicon. Closed-class words (pronouns, articles, prepositions) are much harder to make up. I think there are good information-theoretic reasons that this kind of difference (really a continuum) exists.
@davidmortensen @andrewpiper However, if you're systematic, you can teach it to make up a completely new language, with (I imagine) its own determiners, prepositions, etc. https://maximumeffort.substack.com/p/i-taught-chatgpt-to-invent-a-language
Maximum Effort, Minimum Reward: "I Taught ChatGPT to Invent a Language" — in which ChatGPT and I invent a fictional language spoken by slime-people.
@TedUnderwood @andrewpiper This would make sense. After all, the difference between open class and closed class is really not categorical and, in the same way that you can get humans to make up neopronouns if you try hard enough, you should be able to get GPT to make up new grammatical words (and grammar).