Critically, systems such as ChatGPT do not understand the world they depict or describe. They lack semantic knowledge: they have no grasp of what “inflation” means or what a “street protest” looks like.

Instead, these systems are pattern-modelling engines that predict what content would most plausibly complete or correspond to a given prompt. In short, the output is a function of scale and training data – not comprehension.
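To make the idea of “prediction without comprehension” concrete, here is a deliberately toy sketch: a bigram model that picks the next word purely by counting which word most often followed it in a tiny made-up corpus. This is not how ChatGPT works internally (real systems use large neural networks over tokens), but it illustrates the same principle: the output is driven by statistical patterns in training data, with no grasp of what the words mean.

```python
from collections import Counter, defaultdict

# Hypothetical toy training corpus (an assumption for illustration only).
corpus = (
    "inflation rose sharply . "
    "inflation rose slightly . "
    "prices rose sharply ."
).split()

# Build a bigram table: for each word, count the words that follow it.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most frequent continuation of `word`."""
    return following[word].most_common(1)[0][0]

print(predict_next("inflation"))  # -> "rose": chosen by frequency
print(predict_next("rose"))       # -> "sharply": counts, not meaning
```

The model outputs “rose” after “inflation” not because it knows anything about economics, but because that pairing is the most frequent pattern in its data – the same basic logic, scaled up enormously, behind large language models.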