What LLMs revealed is how many people in our industry don’t like to code.

It’s intriguing that people now claim and showcase what they “built with Claude”, when usually that means they generated a PoC.

It’s funny, because people still focus on how they’re building, so it’s all about the code. And if that’s the message sent outward, together with the claim that LLMs are already better than “average coder Joe”, then the logical follow-up question is: why do we need those humans in the loop?

  • Pycorax@sh.itjust.works · 1 day ago

    For what it’s worth, when they eventually raise the prices, you’d be the one not losing money on every line you write.

    • communism@lemmy.ml · 22 hours ago

      Except you can already download and run models on your local machine for free with Ollama. Price hikes might at least calm the AI craze among the normies, though. Probably not among developers who know how to run LLMs locally.

      • lad@programming.dev · 21 hours ago

        Quality of output differs a lot with local models, but I still think local should be the way forward.