The last few years have made one thing clear: intelligence is valuable.
This value extends beyond the raw number-crunching capacity of computers. The sparks of creativity seen in LLMs, and the nascent understanding emerging in world models, can be sold well before either technology approaches a true general understanding of the world: non-general, domain-specific intelligence is already a product.
Frontier AI labs are not content to sell this domain-specific intelligence. They are research labs, with headcounts in the thousands and eleven-figure funding rounds, united by a single goal: to create general intelligence.
General intelligence is clearly extremely valuable; it is the basis of the entire knowledge-labour market. Estimates of the economic value AGI would generate run comfortably into the trillions, and investors are pouring capital into these labs expecting immense returns once the technology is realised.
However, there is a sense in which general intelligence is fungible by design. Models that perform comparably well across domains, run on the same hardware, and are trained on broadly the same data leave little room for differentiation. If the underlying software becomes too similar across providers, the firms are locked into a competitive equilibrium where profit margins are driven towards zero. Even if the technology is transformative, the firms developing it, and their investors, may see little of the windfall in their annual profit figures.