• Novamdomum@fedia.io · 7 days ago (+25 / −31)

    “Today’s hype will have lasting effects that constrain tomorrow’s possibilities.”

    Nope. No it won’t. I’d love to have the patience to be more diplomatic but they’re just wrong… and dumb.

    I’m getting so sick of these anti AI cultists who seem to be made up of grumpy tech nerds behaving like “I was using AI before it was cool” hipsters and panicking artists and writers. Everyone needs to calm their tits right down. AI isn’t going anywhere. It’s giving creative and executive options to millions of people that just weren’t there before.

    We’re in an adjustment phase right now and boundaries are being redrawn around what constitutes creativity. My leading theory at the moment is that we’ll all mostly, eventually, settle on the idea that AI is just a tool. Once we’re used to it and less starry-eyed about its output, individual creativity, possibly supported by AI tools, will flourish again. It’s going to come down to whether you prefer reading something cogitated, written, drawn or motion-rendered by AI, or whether you enjoy the perspective of a human being more. Both will be true in different scenarios, I expect.

    Honestly, I’ve had to nope out of quite a few forums and servers permanently now because all they do in there is circlejerk about the death of AI. Like this one theory that keeps popping up that image generating AI specifically is inevitably going to collapse in on itself and stop producing quality images. The reverse is so obviously true but they just don’t want to see it. Otherwise smart people are just being so stubborn with this and it’s, quite frankly, depressing to see.

    Also, the tech nerds arguing that AI is just a fancy word-and-pixel-regurgitating engine and that we’ll never have AGI are probably the same people who were really hoping Data would be classified as a sentient lifeform when Bruce Maddox wanted to disassemble him in “The Measure of a Man”.

    How’s that for whiplash?

    • sudneo@lemm.ee · 7 days ago (+30 / −4)

      Models are not improving, companies are still largely (massively) unprofitable, the tech has a very high environmental impact (and demand), and no solid business case has been found so far (despite very large investments) after two years.

      That AI isn’t going anywhere is possible, but LLM-based tools might also simply follow crypto, VR, metaverses and the other tech “revolutions” that were just hype and went nowhere. I can’t say it will go one way or another, but I disagree with you about an “adjustment period”. I think generative AI is cool and fun, but it’s a toy. If companies don’t make money with it, they will eventually stop investing in it.

      Also

      Today’s hype will have lasting effects that constrain tomorrow’s possibilities

      Is absolutely true. Wasting capital (human and economic) on something means it won’t be available for something else. This is especially true now that it’s so hard to get investment for any other business. If all the money right now goes into AI, and IF this turns out to be just hype, we will have collectively lost 2, 4, 10 years of research and investment in other areas (for example, environmental protection). I am really curious what makes you think that sentence is false and stupid.

      • VoterFrog@lemmy.world · 7 days ago (+9 / −4)

        Models are not improving? Since when? Last week? Newer models have been scoring higher and higher in both objective and subjective blind tests consistently. This sounds like the kind of delusional anti-AI shit that the OP was talking about. I mean, holy shit, to try to pass off “models aren’t improving” with a straight face.

        • sudneo@lemm.ee · 7 days ago (+9)

          There is a bunch of research showing that model improvement is marginal compared to the energy demand and/or the amount of training data. OpenAI itself mentioned ~1 month ago that they are seeing a smaller improvement in Orion (I believe) over GPT-4 than there was between GPT-4 and GPT-3. We are also running out of quality data to use for training.

          Essentially what I mean is that the big improvements we saw in the past seem to be over; now improving a little costs a lot. Considering that the costs are exorbitant and the gains small, it’s not impossible to imagine that companies will eventually give up if they can’t monetize this stuff.

          • icecreamtaco@lemmy.world · 6 days ago (+3)

            Compare Llama 1 to the current state-of-the-art local AIs. They’re on a completely different level.

            • sudneo@lemm.ee · 6 days ago (+2)

              Yes, because at the beginning there was tons of room for improvement.

              I mean, take OpenAI’s word for it: ChatGPT 5 is not seeing as much improvement over 4 as 4 did over 3, and it’s costing a fortune and taking forever. Logarithmic curve, it seems. Also, if we run out of data to train on, that’s it.

          • theherk@lemmy.world · 6 days ago (+2)

            Surely you can see there is a difference between marginal improvement with respect to energy and not improving.

            • sudneo@lemm.ee · 6 days ago (+1)

              Yes, I see the difference as hitting the logarithmic tail, which shows we are close to the limit. I also realize that exponential cost is a de facto limit on improvement. If improving again for ChatGPT 7 will cost 10 trillion dollars, I don’t think it will ever happen, right?

    • GHiLA@sh.itjust.works · 7 days ago (+13 / −9)

      It’s fucking fantastic news, tbh.

      Here’s my take, let them dismiss it.

      Let em! Remember Bitcoin at $15k after 2019?

      Let em! And it’s justified! If AI isn’t important right now, then why should its price be inflated to oblivion? Let it fall. Good! Lower prices for those of us who do see the value down the road.

      That’s how speculative investment works. In no way is this bad. Are sales bad? Sit back and enjoy the show.

      • Voroxpete@sh.itjust.works · 7 days ago (+15 / −1)

        Are sales bad?

        Of AI products? By all available metrics, yes, sales for AI driven products are atrocious.

        Even the biggest name in AI is desperately unprofitable. OpenAI has only succeeded in converting 3% of their free users to paid users. To put that in perspective, 40% of regular Spotify users are on premium plans.

        And those paid plans don’t even cover what it costs to run the service for those users. Currently OpenAI are intending to double their subscription costs over the next five years, and that still won’t be enough to make their service profitable. And that’s assuming that they don’t lose subscribers over those increased costs. When their conversion rate at their current price is only 3%, there’s not exactly an obvious appetite to pay more for the same thing.

        And that’s the headline name. The key driver of the industry. And the numbers are just as bad everywhere else you look, either terrible, or deliberately obfuscated (remember, these companies sank billions of capex into this; if sales were good they’d be talking very openly and clearly about just how good they are).

        • GHiLA@sh.itjust.works · 6 days ago (+1)

          That’s if you’re still in the camp that believes climate action isn’t politically impossible.

          at least at this point

          If I were more positive about the situation, I’d agree entirely, but… I don’t think we’re gonna make it, man.