How small can a conlang get?
Toki pona. End of discussion.
Well, maybe, but I've been thinking about turning tokens into words and all that fun stuff (disclaimer: I am against any kind of LLM and similar misuses of that technology), and most importantly about how directions in this high-dimensional space seem to encode true semantic dimensions.
Like how king + woman - man = king + "vector direction associated with femininity" = queen (highly simplified).
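To make the analogy concrete, here is a toy sketch of that vector arithmetic. The vectors are hand-picked for illustration (real embeddings have hundreds of learned dimensions, not two labelled axes), so treat this as a cartoon of the idea, not how word2vec actually works:

```python
import numpy as np

# Toy 2-dimensional "semantic space": axis 0 = royalty, axis 1 = gender.
# These coordinates are invented for illustration only.
vectors = {
    "king":  np.array([1.0,  1.0]),   # royal, masculine
    "queen": np.array([1.0, -1.0]),   # royal, feminine
    "man":   np.array([0.0,  1.0]),   # non-royal, masculine
    "woman": np.array([0.0, -1.0]),   # non-royal, feminine
}

# king - man + woman: remove the "masculine" direction, add the "feminine" one.
result = vectors["king"] - vectors["man"] + vectors["woman"]

# Pick the vocabulary word closest to the resulting point.
nearest = min(vectors, key=lambda w: np.linalg.norm(vectors[w] - result))
print(nearest)  # queen
```

In this cartoon the two axes *are* the semantic dimensions, which is exactly the intuition behind asking how few of them a language could get away with.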
This brought me to the realization that every language has a measurable (maybe not exact, but at least roughly estimable) number of semantic dimensions, usually far lower than the number of words in the language.
Which then made me wonder:
-> How few semantic dimensions do you need for a functional conlang? (I imagine it could be two (binary), but I would be happy to hear your counterpoints.)
-> How many words per semantic dimension do you need to get by, and is there a reason why human languages have so much "redundancy"? (Why not just have "word for magnitude + word for semantic direction" ad nauseam?)
And last but not least: can you make a language with only 3 semantic dimensions and speak in RGB colours?
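For fun, here is what "speaking in RGB" could literally look like: a hypothetical mapping where each of the 3 semantic axes lives in [0, 1] and gets displayed as one colour channel. The axis-to-channel assignment is entirely made up here:

```python
# Hypothetical sketch: treat a point in a 3-dimensional semantic space
# (each coordinate in [0, 1]) as an RGB colour.
# Assumption: axis 0 -> red, axis 1 -> green, axis 2 -> blue.
def semantic_to_rgb(vec):
    # Clamp each coordinate to [0, 1], then scale to the 0-255 channel range.
    return tuple(round(255 * min(max(c, 0.0), 1.0)) for c in vec)

print(semantic_to_rgb((0.5, 0.0, 1.0)))  # (128, 0, 255)
```

Whether three dimensions could carry enough meaning to count as a language is exactly the open question above; this only shows the encoding side is trivial.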
TL;DR: how many semantic directions do you need to make a language?
Per comment request, here are some links if you found this interesting and want to learn more:
About turning words into vectors:
About conlangs:
-> The official toki pona website