Right now. In ten to twenty years this won’t be the case. Also consider the diminishing returns from throwing more hardware at the problem of training AI: despite soaking up the world’s supply of DRAM, the big labs are seeing only marginal improvements in their models.
The curves of hardware cost and model capability are converging unless something drastic changes. Big AI needs a huge breakthrough to stay ahead of that curve, and I don’t see that happening, because the big breakthroughs right now are all in efficiency — and efficiency works against them, since it makes training cheaper and faster for everyone.