When was the last time you found a recipe by asking a friend or family member, rather than looking it up online?
All learning is personal. It is learned from someone. Books often give the illusion of acquiring skills and knowledge from an impersonal source, but even the most solitary learning experience involves at least one other person: the author, the authors, a community of practice, or the distributed cognition of collective intelligence.
ChatGPT and other generative AI systems are not impersonal, although they look like it. Their knowledge is not acquired impersonally; it is trained on the collective intelligence of bodies of work published and produced exclusively by human intelligence.
We don’t really know what impersonal or non-human knowledge looks like, or even whether it’s possible at all. We move from illusion to illusion, simulacrum to simulacrum and back. We are sold a story of machine learning and intelligence that is neither machine nor intelligent. Instead, it is derivative in the sense of being drawn from pre-existing sources, and synthetic in the sense of being processed, like food produced from natural ingredients transformed to the point of self-alienation. Which doesn’t make it any less natural, in the hard sense of the word. Even the most synthetic materials are based on natural products, and machine intelligence is no different.
Generative AI is fundamentally natural language processing. Natural as in natural. And language is the manipulation of common symbols. Common as in common to a population of humans.
So the next time we stand in awe at the output of a language AI, we’d do well to remember the invisible humans doing human magic behind the scenes, across time and space. The curtain may fall, but it lifts again. And again.