The same way they are trying to regulate and limit the 3D printers we already have in our homes; the same way they buy software, kill it, and then discontinue it for future hardware. But I feel the death of open source models will be perpetrated by laws, by gradual operating-system surveillance, and by denying customers the hardware. There is already a manufacture of consent toward that goal: on one side, Chinese open models framed as a “security risk”; on the other, the generation of illegal material by open source uncensored models. And then what good is open source when there is no ecosystem to sustain it?
But to me that doesn’t matter at all, because anything made by gen AI is useless trash anyway.
Also, training models requires expensive hardware. So now you have a bunch of folks who can run LLMs on their systems, but over time those models need to keep learning, and the hardware requirements for training are far more intense. It’s cost-prohibitive across the board. That’s when you have people waiting for some angel with the hardware to get the models up to speed, and that’s oftentimes some corporate entity (the one that shaved off a smidge for open source and is now putting the leash around your neck). None of this is sustainable. AI never gets cheaper.
Right now. In ten to twenty years this won’t be the case. Also consider the diminishing returns of throwing more hardware at the problem of training AI: despite monopolizing the world’s supply of DRAM, AI models are gaining only marginal improvements.
The curves of hardware expense and model capability are on track to intersect unless something drastic changes. Big AI needs a huge breakthrough to stay ahead of that curve, and I don’t see that happening, because the big breakthroughs right now are all in efficiency — which works against them by making training cheaper and faster for everyone.
Most people have PCs in their homes whose computational power goes 99% unused. Is there a reason model training couldn’t be done peer-to-peer?
It’s an interesting concept, and there are tools that achieve it on a local network, but it’s not Folding@home. It would be a massive security hellscape IMO, and it’s not worth the insane power requirements for what folks are doing with it. You’d be hard-pressed to find people willing to lend power, bandwidth, and hardware to that, especially when it doesn’t have a meaningful, focused purpose.
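For what it’s worth, the core mechanism such a scheme would need is just gradient averaging across peers — the same all-reduce idea behind federated learning. Here’s a toy, purely illustrative sketch (no real P2P networking, no security; all names and the linear model are made up for the example):

```python
# Toy simulation of peer-to-peer training by gradient averaging.
# Each "peer" holds a private shard of data for a 1-D linear model
# y = w * x; peers exchange gradients (an all-reduce) and apply the
# same averaged update, so every replica of w stays in sync.
import random

def local_gradient(w, data):
    # Gradient of mean squared error, computed only on this peer's shard.
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

def p2p_training_round(w, shards, lr=0.1):
    grads = [local_gradient(w, shard) for shard in shards]
    avg = sum(grads) / len(grads)       # the "all-reduce" step
    return w - lr * avg

# Three peers, each holding a shard generated from the true rule y = 3x.
random.seed(0)
shards = [[(x, 3 * x) for x in (random.random() for _ in range(20))]
          for _ in range(3)]

w = 0.0
for _ in range(200):
    w = p2p_training_round(w, shards)
# w has now converged close to the true weight 3.0
```

The math scales to real networks, but as noted above, the hard parts are exactly what this sketch omits: trusting untrusted peers’ gradients, and moving model-sized tensors over home bandwidth every round.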
Also, LLMs are built on diminishing returns and constant hype cycles — this is part of the business model. Even in the open source LLM scene, folks are constantly looking for the new release, and I often hear they get tired of an LLM repeating the same patterns over and over again. And then there is the public, which is starting to recognize the speech patterns of a certain model, or when something is made with an LLM. It is already affecting the credibility of some magazines and individual journalists.
Open source image generation depends on buckets of LoRAs and constant refinement — tons of training for the vending machine. So this idea of having your favorite ol’ isolated model on your PC for years of personal use sounds incompatible with the logic of this technology.
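For readers unfamiliar with why LoRAs are so central here: they let the community ship a tiny low-rank adapter instead of retraining the full weight matrix. A minimal, pure-Python sketch of the idea (illustrative shapes and names only — real implementations live in libraries like PEFT):

```python
# LoRA-style low-rank update: the frozen base weights W stay untouched,
# and the effective weights become W + (alpha / r) * B @ A, where
# B (d_out x r) and A (r x d_in) are the small trainable adapter.

def matmul(X, Y):
    # Naive matrix multiply over nested lists.
    return [[sum(a * b for a, b in zip(row, col))
             for col in zip(*Y)] for row in X]

def apply_lora(W, A, B, alpha, r):
    delta = matmul(B, A)          # d_out x d_in, but built from rank r
    s = alpha / r                 # standard LoRA scaling factor
    return [[w + s * d for w, d in zip(wr, dr)]
            for wr, dr in zip(W, delta)]

# A 4x4 base layer plus a rank-1 adapter: 8 adapter numbers stand in
# for 16 full-rank ones; at realistic layer sizes the savings are huge.
W = [[1.0] * 4 for _ in range(4)]
B = [[1.0], [0.0], [0.0], [0.0]]   # 4 x 1
A = [[0.5, 0.5, 0.5, 0.5]]         # 1 x 4
W2 = apply_lora(W, A, B, alpha=2.0, r=1)
# Only the first row of W is shifted (to 2.0); the rest stay at 1.0.
```

That smallness is exactly why the ecosystem churns: adapters are cheap to train and share, so the “finished, frozen model” the comment above imagines never stays frozen.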