• defunct_punk@lemmy.world

    This is the same “I’ll do my own research, thanks” crowd btw

    spoonfeed me harder Silicon Valley VC daddy

  • Avicenna@lemmy.world

    Oh no, not the reading! Good thing we had AI to create AI, so we didn’t have to depend on all those computer scientists and engineers whose only skill is reading stuff.

  • karashta@fedia.io

    Imagine being proud of wasting that time drinking coffee instead of reading and understanding it for yourself…

    Then posting that you’re proud of relying on hallucinated, made-up slop.

    Lmfao.

    • CarrotsHaveEars@lemmy.ml

      -Look at you. Spent four years in college. Six months to go through the documentation for the programming language. Another six months to read the library’s manual and practice the example code. Finally, three months to implement the feature and complete the automated tests. Meanwhile, I write a prompt in thirty seconds and AI gives me the whole project, in a programming language I don’t know, without me understanding any of the technical details.

      -And somehow you are proud of that?

      • supersquirrel@sopuli.xyz

        -And somehow you are proud of that?

        Further, I find it EXTREMELY disturbing that someone would want the secrets of our wondrous journey to be so cynical, solvable, and perfectly designed for authoritarian consolidation of power.

        • Soup@lemmy.world

          The LLM will eventually steal the code, though, and people will claim it invented something.

  • Overkrill@midwest.social

    you read books and eat vegetables like a loser

    my daddy lets me play nintendo 64 and eat cotton candy

    we are not the same

  • RememberTheApollo_@lemmy.world

    “I used many words to ask the AI to tell me a story using unverified sources to give me the answer I want and have no desire to fact check.”

    GIGO.

    • stabby_cicada@slrpnk.net

      I mean, how many people fact check a book? Even at the most basic level of reading the citations, finding the sources the book cited, and making sure they say what the book claims they say?

      In the vast majority of cases, when we read a book, we trust the editors to fact check.

      AI has no editors and generates false statements all the time because it has no ability to tell true statements from false. Which is why letting an AI summarize sources, instead of reading those sources for yourself, introduces one very large procedurally generated point of failure.

      But let’s not pretend the average person fact checks anything. The average person decides who they trust and relies on their trust in that person or source rather than fact checking themselves.

      Which is one of the many reasons why Trump won.

      • RememberTheApollo_@lemmy.world

        This is a two-part problem. The first part is that LLMs are going to give you shoddy results riddled with errors. This is known. Would you pick up a book and take it as the truth if analysis of the author’s work showed that 50% of their facts were wrong? The second part is that the asker has no intent to verify the LLM’s output; they likely just want the output and to be done with it. No critical thinking required. The recipient is only interested in a copy-paste way of transferring info.

        If someone takes the time to actually read and process a book with the intent of absorbing it and adding to their knowledge, they mentally balance what they read against what they already know, hopefully cross-referencing that information internally and gauging it with at least a “that sounds right,” but ideally by reading more.

        These are not the same thing. Books and LLMs are not the same. Anyone can read the exact same book and offer a critical analysis. Anyone asking an LLM a question might get an entirely different response depending on minor differences in how the question is asked.

        Sure, you can copy-paste from a book, but if you haven’t read it, then yeah…that’s like copy-pasting an LLM response. No intent of learning, no critical thought, etc.

    • pulsewidth@lemmy.world

      The additional hour might be the time they have to work so that they can pay for the LLM access.

      Because that is another aspect of what LLMs really are: another Silicon Valley rapid-scale, venture-capital money-pit service hoping that by the time they’ve dominated the market and spent trillions, they can turn around and squeeze their users hard.

      The only trouble with fighting this with logic is that the market they’re attempting to wipe out is people’s ability to assess data and think critically.

      • defunct_punk@lemmy.world

        Indeed. Folks right now don’t understand that their queries are being 99.9% subsidized by trillions in VC money hoping to dominate a market. A tech tale as old as time, and people are falling for it hook, line, and sinker.

    • d00ery@lemmy.world

      Impressed that he can think of the information he needs in 2 minutes. Why even bother researching if you already know what you need…

      Seriously though, reading and understanding generally just leaves me with more very relevant questions, and some answers.

  • hperrin@lemmy.ca

    In other words: I don’t understand why someone would want to think when being lazy is available to them.

  • Estradiol Enjoyer @lemmy.blahaj.zone

    While you were studying books, he studied a cup of coffee. TBH, I can spend an hour both reading and drinking coffee at the same time; idk why it’s got to be its own thing.

    • ironhydroxide@sh.itjust.works

      Look at this guy over here, bragging about multitasking. Next he’ll tell us he can drink coffee and write multiple prompts in an hour. /s

  • AbsolutelyNotAVelociraptor@sh.itjust.works

    I’ve seen this at work.

    We installed a new water sampler, and they sent an official installer to set up and commission the device. The guy couldn’t answer a damn question about the product without ChatGPT. When I asked a relatively complex question that the bot couldn’t answer (that was only the third question), I decided I’d had enough and spent an hour reading the thing’s manual. Turns out the bot was making up the answers, and I learned how to commission the device without the “official support”.

  • lath@lemmy.world

    "I ran this Convo through an LLM and it said i should fire and replace you with an LLM for increased productivity and efficiency.

    Oh wait, hold on. I read that wrong, it said I should set you on fire…

    Well, LLMs can’t be wrong so…"