• TheObviousSolution@lemmy.ca · 2 days ago

    local LLMs are not a replacement for Opus

    https://www.bitdoze.com/best-open-source-llms-claude-alternative/

    Something tells me you haven’t even made the effort. They are not that good, in the same way that LibreOffice is not as good as Excel. But if you are going to make the argument you quote, then you can work that brain muscle and adapt.

    And they aren’t training off of the Internet because they are training on your input. It’s mind-boggling to me how some people are so willing to train their replacements, and even pay for the privilege, in exchange for an advantage that will be very temporary in the future we are heading toward.

    A lot of your criticism doesn’t even apply to local LLMs: either they are trained by model distillation from more advanced models, or they are snapshots frozen in time. It’s also telling how willing you seem to be to let the Internet burn, because the inevitable result is becoming a corporate slave and accepting ever-increasing subscription fees you can’t escape because “hey, they’ve got the most users, the Internet is too dead, your open alternatives are no replacement for us”. You say you are not, but you are saying everything an AI/AGI astroturfer would say, and the irony of hearing it on an open source, federated platform rather than somewhere like Reddit is striking.

    • Evotech@lemmy.world · 1 day ago

      Sorry but it’s not even slightly comparable.

      Frontier models versus whatever you can realistically host yourself, that is.

      • TheObviousSolution@lemmy.ca · 1 day ago

        That you don’t want to or aren’t able to compare them doesn’t mean they can’t be compared. You do you, or more aptly, have an AI do you since you can’t bother.

        • Evotech@lemmy.world · 1 day ago

          Oh, I’ve tried. Don’t assume I haven’t.

          In terms of functionality on paper, they’re similar. In terms of what they can realistically do, they’re not.

    • BillyTheKid2@lemmy.ca · 2 days ago

      I could have worded that differently, I apologize.

      They aren’t a replacement for somebody like me who doesn’t have a screaming GPU.

      Yes, they train on input. I don’t like it either. It’s not just creepy; I’m sure it breaks privacy laws everywhere.

      Regardless, you’ve already decided who I am so I don’t see this conversation being productive.

      I again apologize for not making my previous comment more straightforward.

      • TheObviousSolution@lemmy.ca · 2 days ago

        Oh, I don’t claim to know who you are; I just think the difference is indiscernible.

        They aren’t a replacement for somebody like me who doesn’t have a screaming GPU.

        You can run small LLMs that are still surprisingly good purely on modern CPUs, although I’m sure part of the intent of locking down hardware supplies behind the bubble is to prevent exactly that.