I’ve been working with so many students who turn to it as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.

It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not really something I can understand. There is no reason to avoid getting better at writing.

  • lohky@lemmy.world · ↑34 · 23 hours ago

    I hate that LLMs have fucked my ability to find decent documentation. The Internet is done for. I’m learning to garden and do basic electronics from text books now.

    • hardcoreufo@lemmy.world · ↑6 · 14 hours ago

      I don’t know anything about gardening, but for electronics I can recommend Practical Electronics for Inventors and the Atari “The Book.” It’s focused on arcade cabinet repair but definitely has useful info for basic circuit troubleshooting that is applicable today.

      • lohky@lemmy.world · ↑4 · 10 hours ago

        I’ve been reading Practical Electronics for Inventors and watching the MIT courses on YouTube.

        Also picked up an Arduino kit and started tinkering, but I’m more interested in circuitry and not coding. My 6-year-old wants to build his own Moog synth because he’s obsessed with Daft Punk and I gotta support that.

    • NickwithaC@lemmy.world · ↑8 · 21 hours ago

      Hopefully not text books that were published in the last 2 years because those risk being written by ai too.

      We’ve reached the carbon dating limit of human knowledge, since nothing can now be verified as written by a human unless you personally watched them do it.

  • EzTerry@lemmy.zip · ↑4 · 1 day ago

    I don’t know how to solve the core problem you are hinting at without society at large realizing that many of our problems come from the brainwashing of the masses. This problem is why we were initially taught math without calculators in my day; by college, calculators were expected to handle the simple math so we could focus on the more complicated problems.

    Here with LLMs it’s still important to write, to learn to research something (even more than the old “don’t use encyclopedias as a primary source” rule), to learn to read with deep understanding, and to learn to skim. Learning math and logic is as important as ever.

    What I see missing quite a bit in the anti-AI art world is the importance of creating art to convey your meaning. Whether or not AI is a tool involved, for writing or images etc., the question is: is this thing showing the meaning and nuance you want, and not just an off-the-top-of-your-head comment with the slop output auto-shipped? And the only way you can say “no, that’s not what I want” is to have some idea how to make the piece of writing or art yourself, even at a high level.

    I personally like the tech, but I see it accelerating the brain drain for those that rely on it too much for answers as they learn.

    • SuspciousCarrot78@lemmy.world · ↑3 · 1 day ago

      Yeah, that’s the way I came up too. But I disagree with the “maths without calculators” approach - mainly because it feels like a brute-force solution that ignores the reality that calculators exist.

      So does ChatGPT.

      We should learn to use the tools we have, not pretend they aren’t there.

      More importantly, using something like “do the maths the long way” as a proxy for teaching reasoning probably has limited transfer if it’s not framed explicitly. Like you, I learned a lot of logic through algebra - but no one ever connected those dots. I only realized years later that the real lesson was about reasoning, not just manipulating symbols.

      What I’m getting at is:

      • the tools are already here
      • avoiding them isn’t realistic
      • teaching thinking indirectly through other skills is a pretty unreliable way to transmit it

      If we actually care about developing thinkers, we probably need to teach reasoning, skepticism, and how to interrogate outputs directly, including outputs from tools like AI.

    • Tom Arrr@lemmy.world · ↑1 · 1 day ago

      The importance of creating art is to convey your feeling. Conveying your meaning is a nice addition if you like that. How does ai convey its feeling?

      Another thing we will lose to ai along with the ability/desire to learn.

      • EzTerry@lemmy.zip · ↑3 · 1 day ago

        So a few things: 1) What is art to the viewer? Honestly, you can hand-craft slop and I don’t want to see it; what matters here is the meaning you get, the story you the viewer make from it. 2) AI art is bad in two ways in my experience: the story (i.e., point one) has something critically wrong from the human world, and most tools today don’t listen to the prompt, so no matter how much writing is fed in, the result is limited. But if you use it as a tool, yes, it can make some backgrounds and clip art. The problem is most consumer systems don’t have a good way to put this into a proper editor to finalize a meaningful image, i.e., to put the human story into the final thing.

        But the real point isn’t that you made some clip art with image gen or a paragraph in your story; it’s why it didn’t fit in, what contradictions it shows, and whether you the human know how to fix it. Is there tooling you can direct the LLM to do better? (For images there really is not; it is not good at partial edits in my experience. It’s easier to have it generate all the parts on their own and layer the final product, otherwise something will be wrong.)

  • Sivecano@lemmy.dbzer0.com · ↑2 · 10 hours ago

    Once men turned their thinking over to machines in the hope that this would set them free… But this only allowed other men with machines to control them.

  • Eggyhead@lemmy.world · ↑18 · 1 day ago

    When I try to do a general search for help on how to solve a problem, the top results in most search engines aren’t the old Academy-style video guides anymore. They are sponsored links, paid tutoring websites, and YouTube videos of people playing at influencer instead of teaching.

    Just wait until the AI companies move on from the onboarding phase and into the enshittification one.

    • HexaBack@lemmy.blahaj.zone · ↑3 · 23 hours ago

      even worse when those modern video guides purposely include red herrings to throw you off and make you buy their [shitty chatgpt-generated] paid course in the video’s description… 🤦‍♀️

      • Eggyhead@lemmy.world · ↑2 · 20 hours ago

        AI is going to be trained to hawk sponsored goods and services at you as soon as the AI companies figure out how their own software works.

  • sheetzoos@lemmy.world · ↑9 ↓20 · 20 hours ago

    AI is only going to become more ubiquitous.

    If you don’t learn to adapt, and regulate your emotions when you encounter it, you’re going to be miserable.

  • ARealAlaskan@lemmy.ca · ↑4 ↓1 · 10 hours ago

    You are so right about how important the process of thinking and learning is, and that is where AI fails.

    I am not a teacher, but a couple weeks ago I was a guest speaker in a high school IT class. I told them all about how critical it is to be an effective communicator by documenting the steps in their tickets in a way that others can follow, and told them, straight up, that communication is a skill: if you can’t communicate, I will not hire you. I told them I have actively declined to hire or promote people because they didn’t communicate effectively.

    I am not sure how to do something similar with, say, an English class, but I wonder if you could figure out how to expose them to the future professional repercussions of not understanding the topic deeply. I think it hit differently when the repercussion wasn’t just that their instructor would be unhappy.

    • DavidDoesLemmy@aussie.zone · ↑3 · 9 hours ago

      AI is brilliant for learning. Endlessly patient, answers all my questions at a pace that suits me, can combine knowledge from hundreds of different sources to find the right concept, or the best way to explain something. If you’re not able to learn with AI, you’re doing something wrong.

      Just ask it to explain bloom filters to you. Keep asking questions until you get it.
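      The Bloom filter mentioned above is a handy example precisely because it’s small enough to sketch by hand once you understand it. Here is a minimal illustrative Python toy (the bit-array size and hash count are arbitrary demo parameters, and salting one hash function to simulate k independent hashes is just one common trick), showing what the “keep asking questions until you get it” loop should eventually leave you able to write yourself:

```python
import hashlib

class BloomFilter:
    """A toy Bloom filter: a bit array plus k hash functions.
    Membership tests can give false positives, never false negatives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _indexes(self, item):
        # Derive k indexes by salting a single hash function with a counter.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = True

    def might_contain(self, item):
        # False means definitely absent; True means "probably present".
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
bf.add("hello")
print(bf.might_contain("hello"))    # True
print(bf.might_contain("goodbye"))  # almost certainly False (tiny false-positive chance)
```

      The point of the exercise isn’t this particular sketch; it’s that being able to produce something like it is the test of whether the questions actually landed.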

      • ARealAlaskan@lemmy.ca · ↑1 · 8 hours ago

        AI can point you in interesting directions, but if it is your first and only source, and you trust it to combine all these other sources together, you are shorting yourself. It does not do as well as you think it does at combining ideas, identifying edge cases, or real understanding. What it is teaching you may or may not be broadly accurate. It is a starting place, which, as I interpreted the OP, was their students’ primary and often only source.

        The act of forming hypotheses and researching to understand is part of learning. If all your learning comes from reading tailored answers to specific questions, you miss out on exposure to other thoughts that you would bump into by researching.

        I’ve used AI to try to research things, and EVERY time, on deeper inspection of an idea, some of the information it shared ranged from false to technically true but not … really right.

        It is, at best, like a personal TA: someone whose office hours you go to when you are stumped on a thing you’ve learned and need the idea explained differently, or when you have no idea where to start and need a point in the right direction. Helpful, but you would never use that person to write your research.

        • KeenFlame@feddit.nu · ↑1 · 8 hours ago

          It has problems with truthfulness, but for topics that are well known it can be like having a better search engine or tutor.

  • sudoer777@lemmy.ml · ↑1 ↓3 · 9 hours ago

    There is no reason to avoid getting better at writing.

    Having better things to do is a valid reason.

    The first source for research is AI.

    AI with search capabilities is actually helpful for that.

      • sudoer777@lemmy.ml · ↑1 ↓1 · edited · 9 hours ago

        Then ask it for the sources behind the search results and verify them yourself, obviously.

  • tostane@thelemmy.club · ↑1 ↓5 · 10 hours ago

    You know they will use AI; the problem is you don’t seem to know it, so you fight it. We are in a time when most people’s PCs cannot really run it, and you depend on a few online services. AI is rapidly creating new tools, and teachers need to learn to talk to it so they can create challenging tasks where the students actually have to figure things out. Like using ComfyUI and creating a song in a certain genre with some emotion, or using AI to make a photo of two women with different-color outfits and different styles of fingernails, where for the outfits you only give them a photo but not a name and they have to figure it out. AI is not easy if you actually try to create something worth creating. Students in China are learning to use it at 5 years old.

    • KeenFlame@feddit.nu · ↑1 · edited · 8 hours ago

      You mean you have become adept at interracial porn prompts and now you think you are an ai whisperer?

  • GaMEChld@lemmy.world · ↑7 ↓8 · edited · 12 hours ago

    My thoughts on AI are: I don’t blame guns for gun violence, I don’t blame hammers when a contractor screws up, and I don’t blame AI tools when the student is too dumb to utilize them properly. I’ve been using ChatGPT to great effect, but I’m well aware of what it is equipped to handle and what it is not.

    Else I’d be the type of person to grab a hammer and then rage at the void about how bad hammers are at cooking Thanksgiving dinner.

    • wpb@lemmy.world · ↑12 ↓1 · 11 hours ago

      Counterpoint: the main product of a student writing a piece is not the piece they wrote, but the act of writing it. If you evaluate the outcome of the situation solely by the piece of paper and the words that are written on it, then the world is a much better place for students using LLMs. But if you evaluate the outcome by the student’s understanding of the subject, then I think we’re better off with the students having to mentally explore the nooks and crannies, footguns and subtleties of the subject. We’re better off with them pursuing a wrong line of thought, realizing it, and having to go back and try again.

      Having a student write a piece – and by this I really mean write a piece, not delegate it or parts of it to a third party – is incredibly beneficial. Annoyingly, our means for checking that a student wrote a piece has always been to look at the words they wrote on a piece of paper, but the words and the paper were never really the point.

      • lightsblinken@lemmy.world · ↑3 · 11 hours ago

        this is 100% how i feel as well… learning is about teaching you how to do something, not the outcome itself. exploring structured thinking, critical thinking, creative thinking, etc (all the hats) is immensely beneficial to developing the mind imho.

      • lps2@lemmy.ml · ↑3 ↓4 · 11 hours ago

        Counterpoint: it means that writing papers is no longer a good exercise for ensuring students are learning material, and teachers need to adapt. AI isn’t going away, and it’s a disservice to students not to teach them how to use it, how to find good primary sources, etc.

        • wpb@lemmy.world · ↑3 · 11 hours ago

          Ok, so draw the rest of the owl then. What alternative is there right now, ready for use, that will engage the students with the material as well as writing does?

          • sudoer777@lemmy.ml · ↑2 · edited · 9 hours ago

            Learning how to construct logical arguments, do research that makes sense, and communicate effectively to the right audience, all of which AI writing sucks at.

        • Enekk@lemmy.world · ↑1 · 10 hours ago

          Let me ask you something. It is completely possible for a machine to do simple welds, right? Would you say that there is no reason for a welder to practice simple welds since a machine can do it?

          To me, the same is true of writing. Nobody cares about the essay that was written; it is the practice of writing that people do care about. You can’t learn skills like this without doing them.

      • GaMEChld@lemmy.world · ↑3 ↓1 · edited · 11 hours ago

        You completely missed my point. Everything you said that was bad is them using the tool improperly for the goal you stated. That’s on the student and teacher, not the tool.

        If the goal is learn to write, then the tool should be used to analyze the work you wrote and provide objective criticism so you can refine it. Instructing the AI to just write the final draft of the assignment for you is what I would call using a hammer to cook Thanksgiving dinner.

        Is there a new problem for teachers to figure out how to test for mastery of a subject? Yep. That sucks for them. Teachers always have impossible tasks forced on them by society. I still don’t blame the tool for any of that.

        I think what you actually hate is irresponsible use of tools to nefarious or counterproductive ends.

  • heavy@sh.itjust.works · ↑8 ↓1 · 11 hours ago

    Let’s go, I also fucking hate this shit, feel like I’m drowning in it. Is this the future we wanted? I fucking hate it.