What LLMs revealed is how many people in our industry don’t like to code.

It’s intriguing that they now claim and showcase what they “built with Claude”, when usually that means they generated a PoC.

It’s funny, because people still focus on how they’re building, so it’s all about the code. And if that’s the message sent outside, along with the claim that LLMs are already better than the “average coder Joe”, then the logical follow-up question is: why do we need those humans in the loop?

  • PokerChips@programming.dev · 2 days ago

    There’s a guy in my niche showcasing his website “built with Claude”. Built in a couple weeks. Irks the hell out of me. Looks nice on the surface. Meanwhile, I’m building from scratch and spending half a year.

    • ExLisper@lemmy.curiana.net · 1 day ago

      Why are you building your website? Do you need to do something specific? Then use AI: make the site, use it, forget about it.

      You want to learn new tools? You just like coding? You want to grow and maintain the app? Write it yourself. You will learn, have fun, and end up with something you can maintain for a long time.

      Coding is not a race. Yes, people will make simple apps faster with AI, but quickly making something that works is not always the goal.

    • Pycorax@sh.itjust.works · 1 day ago

      For what it’s worth, when they eventually raise the prices on it, you’d be the one not losing money for every line you write.

      • communism@lemmy.ml · 22 hours ago

        Except you can already download and run models on your local machine for free with ollama. Price hikes might at least calm the AI craze among the normies, though. Probably not among developers who know how to run LLMs locally.
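
        As a rough sketch of what that looks like (assuming ollama is installed, `ollama serve` is listening on its default port 11434, and a model has already been pulled — the model name and prompt below are just placeholders), a local model can be queried from a few lines of Python:

            import requests

            # Ask the local ollama server for a single, non-streamed completion.
            response = requests.post(
                "http://localhost:11434/api/generate",
                json={
                    "model": "llama3",  # assumed to be pulled already, e.g. `ollama pull llama3`
                    "prompt": "Write a Python function that reverses a string.",
                    "stream": False,    # return one complete JSON response instead of streamed chunks
                },
                timeout=120,
            )
            response.raise_for_status()
            print(response.json()["response"])  # the generated text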

        • lad@programming.dev · 21 hours ago

          Quality of output differs a lot between local models, but I also think that local should be the way forward.