• Phoenixz@lemmy.ca · ↑16 · 4 days ago

    So by that reasoning all Microsoft software is open source

    Not that we’d want it, it’s horrendously bad, but still

  • Alberat@lemmy.world · ↑10 · 4 days ago

    Windows engineers have probably been copying snippets from Stack Overflow for decades, and some of those snippets may themselves have been copied from the Linux kernel or some other copyleft project

    • sunbeam60@feddit.uk · ↑1 · edited · 3 days ago

      How many Windows engineers have you met?

      99% of the ones I met and worked with were very, very good.

      Don’t confuse maintaining backwards compatibility and managing real concerns of large customers with bad engineering.

      • Alberat@lemmy.world · ↑1 · 3 days ago

        You can be a good engineer and still copy code from Stack Exchange. I wasn’t saying Windows engineers are bad.

  • meekah@discuss.tchncs.de · ↑31 · 4 days ago

    Aren’t you all forgetting the core meaning of open source? The source code is not openly accessible, thus it can’t be FOSS or even OSS

    This just means microslop can’t enforce their licenses, making it legal to pirate that shit

    • the_artic_one@programming.dev · ↑4 · 4 days ago

      It’s just the code that’s not under copyright, so if someone leaked it you could legally copy and distribute any parts that are AI-generated, but it wouldn’t invalidate the copyright on the official binaries.

      If all the code were AI generated (or enough of it to be able to fill in the blanks), you might be able to make a case that it’s legal to build and distribute binaries, but why would you bother distributing that slop?

      • m0stlyharmless@lemmy.zip · ↑2 · 4 days ago

        Even if it were leaked, it would still likely be very difficult to prove that any one component was machine generated from a system trained on publicly accessible code.

  • ZILtoid1991@lemmy.world · ↑75 · 4 days ago

    I think, to punish Micro$lop for its collaboration with fascists and its monopolistic behavior, the whole Windows codebase should be made public domain.

      • bitwolf@sh.itjust.works · ↑12 · edited · 4 days ago

        We don’t have to use it for anything other than compatibility.

        Personally I’d very much like for ACPI to be un-fucked

      • MonkderVierte@lemmy.zip · ↑15 · edited · 4 days ago

        The kernel and NTFS seem decent from what I’ve heard. Or at least the kernel was (no telling what they’ve vibecoded into it by now).

        About NTFS: it was actually pretty good for its time (the 90s), but the tooling makes no use of some of its better features and abuses some others close to the breaking point. Literally pearls before swine.

        • skip0110@lemmy.zip · ↑5 · 4 days ago

          There were decent (at least, they worked for me) NTFS drivers for Linux like 20 years ago. (Back when I felt the need to dual boot.)

  • Evil_Shrubbery@thelemmy.club · ↑18 · 4 days ago

    By that same logic LLMs themselves (by now some AI bro had to vibe code something there) & their trained datapoints (which were on stolen data anyway) should be public domain.

    What revolutionary force can legislate and enforce this?? Pls!?

    • CanadaPlus@lemmy.sdf.org · ↑4 ↓1 · edited · 3 days ago

      By that same logic LLMs themselves (by now some AI bro had to vibe code something there)

      I’m guessing LLMs are still really really bad at that kind of programming. The packaging of the LLM, sure.

      & their trained datapoints

      For legal purposes, it seems like the weights would be generated by the human-made training algorithm. I have no idea if that’s copyrightable under US law. The standard approach seems to be to keep them a trade secret and pretend there’s no espionage, though.
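
      To make the distinction concrete, here’s a toy, hypothetical example (nothing like a real training pipeline): the human-authored, plausibly copyrightable artifact is the training loop; the weights are just the numbers it spits out.

          # Toy example: the loop is human-written; the resulting weights are
          # machine-computed output of that loop plus the data.
          import numpy as np

          rng = np.random.default_rng(0)
          X = rng.normal(size=(100, 3))                 # stand-in training data
          y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

          w = np.zeros(3)                               # the "weights"
          for _ in range(1000):                         # human-authored algorithm...
              grad = 2 * X.T @ (X @ w - y) / len(y)
              w -= 0.05 * grad                          # ...mechanically producing them

          print(w)   # the artifact whose copyright status is the open question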

      • Evil_Shrubbery@thelemmy.club · ↑1 · edited · 4 days ago

        The packaging of the LLM, sure.

        Yes, totally, but OP says a small bit affects “possibly the whole project”, so I wanted to point out that this probably includes AIs, Windows, etc. too.

  • explodicle@sh.itjust.works · ↑11 · 4 days ago

    Oh darn, our CEO told us to use LLMs to write all this code, and now the good parts might be used for something that helps people. Not our copyrights!

  • Michal@programming.dev · ↑29 ↓1 · 4 days ago

    Counterpoint: how do you even prove that any part of the code was AI-generated?

    Also, I made a script years ago that algorithmically generates Python code from user input. Is it now considered AI-generated too?
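
    For the curious, the kind of script being described might look something like this (a hypothetical sketch; the names and template are mine, not the actual script): a fixed, human-written algorithm that emits Python source from whatever the user types in.

        # Hypothetical sketch of a deterministic code generator: the algorithm is
        # fixed and human-authored; only the user's input varies.
        def generate_fetcher(resource: str) -> str:
            """Emit Python source for a function that fetches the named resource."""
            return (
                f"def fetch_{resource}(client):\n"
                f"    return client.get('/{resource}')\n"
            )

        # Usage: feed in user input, get machine-generated Python code back.
        print(generate_fetcher("users"))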

    • Wiz@midwest.social · ↑19 ↓4 · 4 days ago

      I made a script years ago that algorithmically generates Python code from user input. Is it now considered AI-generated too?

      No, because you created the generation algorithm. Any code it generates is yours.

      • skami@sh.itjust.works · ↑8 · 4 days ago

        That’s not how I understand it, but I’m not a lawyer. The user who uses the script to generate the code can copyright the output, and OP can copyright their script (and the output they themselves generate). If it worked like you said, it would be trivial to write a script that generates all possible code by enumerating every possible program; then, because the script will eventually generate your code, your code would already be copyrighted. That appears absurd to me.

        Relevant: https://www.vice.com/en/article/musicians-algorithmically-generate-every-possible-melody-release-them-to-public-domain/
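
        A toy version of that enumeration argument, purely for illustration (the code below is mine, not from the linked project):

            # Enumerate every possible "program" over a tiny alphabet, shortest first.
            # Run long enough, this eventually emits any source file you care to name,
            # which is why "the generator's author owns all output" collapses into absurdity.
            import itertools

            ALPHABET = "abcdefghijklmnopqrstuvwxyz =+()\n"

            def all_programs():
                for length in itertools.count(1):
                    for chars in itertools.product(ALPHABET, repeat=length):
                        yield "".join(chars)

            for _, src in zip(range(5), all_programs()):
                print(repr(src))   # 'a', 'b', 'c', ... and, eventually, everything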

        If the script copies chunks of code that are under the original script writer’s copyright, then typically the original owner keeps the copyright on those chunks and licenses them to the user in some way. But the code derived from the user’s input is still copyrightable by the user. And that last part is what’s most interesting for the copyright of AI works. I’m curious how the law will settle on that.

        I’m open to counterarguments.

    • JackbyDev@programming.dev · ↑5 · 4 days ago

      Computer output cannot be copyrighted; don’t focus on it being “AI”. It’s not quite so simple, though: there’s some nuance about how much human input is required, and we’ll likely see that tested in court at some point. The frustrating thing is that a lot of this boils down to speculation until it actually goes to court.

    • Dyskolos@lemmy.zip · ↑2 ↓1 · 4 days ago

      Guess you can’t really prove that, unless you leave comments like “generated by Claude” in it with a timestamp and whatnot 😁 Or unless someone can prove that you’d be unable to get to that result yourself.

      So nonsense, yes.

      • VeryVito@lemmy.ml · ↑8 · 4 days ago

        Or unless someone can prove that you’d be unable to get to that result yourself.

        Oh shit… I’ve got terabytes of code I’ve written over the years that I’d be hard-pressed to even begin to understand today. The other day I discovered a folder full of old C++ libraries I wrote 20+ years ago, and I honestly don’t remember ever coding in C++.

          • VeryVito@lemmy.ml · ↑1 · 4 days ago

            True enough, and I expected to get checked on that.

            Regardless… along with the archives, assets and versioned duplicates, my old projects dating back to the 90s somehow now fill multiple TB of old hard drives that I continue to pack-rat away in my office. Useless and pointless to keep, but every piece was once a priority for someone.

      • mattvanlaw@lemmy.world · ↑3 ↓1 · edited · 4 days ago

        Cursor, an AI/agentic-first IDE, is doing this with a blame-style method: as each line is modified or added, it DOES keep a history of AI versus each human contributor.

        So it’s not nonsense in principle, but in practice there’s no real enforcement to turn the feature on.
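
        I can’t speak to Cursor’s internal format, but a rough approximation of that blame-style attribution with plain git would be committing agent output under a dedicated author and then tallying it per line; the “Cursor Agent” author name and the file path below are made-up examples.

            # Not Cursor's actual implementation -- just a sketch: if agent-written
            # changes are committed under a dedicated author (e.g. "Cursor Agent"),
            # per-line AI-vs-human history falls out of ordinary git blame.
            import subprocess
            from collections import Counter

            def line_authors(path: str) -> Counter:
                out = subprocess.run(
                    ["git", "blame", "--line-porcelain", path],
                    capture_output=True, text=True, check=True,
                ).stdout
                # --line-porcelain emits an "author <name>" header for every line
                return Counter(
                    line[len("author "):]
                    for line in out.splitlines()
                    if line.startswith("author ")
                )

            print(line_authors("src/main.py"))   # e.g. Counter({'Cursor Agent': 120, 'Alice': 43})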

        • Rooster326@programming.dev · ↑2 · edited · 4 days ago

          Why would you ever want this?

          If you pushed the bug that took down production, they aren’t going to entertain whataboutism over the AI having generated it. They’re still going to fire you.

          • sunbeam60@feddit.uk · ↑2 · 4 days ago

            It makes little difference IMHO. If you crash the car, you can’t escape liability by blaming the self-driving system.

            Likewise, if you commit it, you own it, however it was generated.

            • mattvanlaw@lemmy.world · ↑2 · 4 days ago

              It’s mainly for developers to follow decisions made over many iterations of files in a codebase. A CTO might crawl the git blame… but it’s usually us crunchy devs in the trenches getting by.

          • mattvanlaw@lemmy.world · ↑1 · 4 days ago

            Sorry, but as another reply: pushing bugs to production doesn’t immediately equate to getting fired. Bug tickets are common, and plenty of them address issues that are already in production.

              • mattvanlaw@lemmy.world · ↑1 · 3 days ago

                I guess you mean a full outage for all users? My bad, there are just a lot of ways to take the verb “down”. Still, though, what a crappy company it would be to fire someone over that instead of learning from the experience!

    • sunbeam60@feddit.uk · ↑3 · 4 days ago

      OP is obviously ignorant of how much tooling has already helped write boilerplate code.

      Besides, AI code is actually one of the harder things to detect, compared to prose.

      And all that said, AI does an amazing job writing a lot of the boilerplate TDD tests etc. To pretend otherwise is to ignore the facts.

      AI can actually write great code, but it needs an incredible number of tests wrapped around it and a strict architecture that it’s forced to stick to. Yes, it’s far too happy sprinkling in magic constants and repeated code, so it needs a considerable amount of support to clean that up… but it’s still vastly faster to write good code with an AI held on a short leash than it is to write good code by hand.
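
      In practice the “short leash” mostly means tests like the ones below sitting around everything the model touches; the module and function names here are invented purely for the sake of the example.

          # Illustrative only: the boilerplate-style test harness you wrap around
          # AI-written code so the short leash is enforced by CI rather than by hope.
          import pytest
          from invoicing import apply_discount   # hypothetical module under test

          @pytest.mark.parametrize("total, rate, expected", [
              (100.0, 0.0, 100.0),   # no discount
              (100.0, 0.25, 75.0),   # simple case
              (0.0, 0.5, 0.0),       # empty order
          ])
          def test_apply_discount(total, rate, expected):
              assert apply_discount(total, rate) == pytest.approx(expected)

          def test_rejects_negative_rate():
              with pytest.raises(ValueError):
                  apply_discount(100.0, -0.1)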

  • Blackmist@feddit.uk · ↑12 · 4 days ago

    That’s terrible news. There’s no way I want my code to be open source. Then other people would see just how much spaghetti you can have in a codebase and still have it run.

  • JATth@lemmy.world · ↑1 · 3 days ago

    Stick to GPL licensing of your code whenever possible and the EEE garbage (embrace, extend, extinguish) can’t subdue you.

    If they plagiarize it, they kind of owe you the honor.

    However, plagiarism is still plagiarism, so you’d better actually write some of your code by hand.
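
    In practice, sticking to the GPL just means marking every file; here’s a minimal sketch of the usual header (the SPDX line plus the FSF-recommended notice), with placeholders left as placeholders:

        # SPDX-License-Identifier: GPL-3.0-or-later
        # Copyright (C) <year>  <your name>
        #
        # This program is free software: you can redistribute it and/or modify it
        # under the terms of the GNU General Public License as published by the
        # Free Software Foundation, either version 3 of the License, or (at your
        # option) any later version.

        def main() -> None:
            print("hello, copyleft")

        if __name__ == "__main__":
            main()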

    • nibbler@discuss.tchncs.de · ↑4 ↓1 · 4 days ago

      If you can’t enforce copyright, how do you stop others from giving it away for free and editing it, making it FOSS…?

        • JackbyDev@programming.dev · ↑1 · 4 days ago

          This is why CC0 should not be used for code. Its public license fallback explicitly does not give patent rights. Compare that to MIT, which implicitly does by saying you can use the software however you want. CC0 literally has this clause in its public license fallback:

          No trademark or patent rights held by Affirmer are waived, abandoned, surrendered, licensed or otherwise affected by this document.

  • Kokesh@lemmy.world · ↑19 ↓4 · 4 days ago

    As it should. All these idiots calling themselves programmers because they tell a crappy chatbot what to write, based on stolen knowledge. What warms my heart a little is the fact that I poisoned everything I ever wrote on StackOverflow just enough to screw with AI slopbots. I hope I contributed my grain of sand toward making this shit a little worse.

    • DeathsEmbrace@lemmy.world · ↑3 · 4 days ago

      Do it in a way that a human can understand but an AI fails on. I remember my early days, and you guys are my MVPs helping me figure shit out.

      • Chakravanti@monero.town · ↑1 ↓4 · 3 days ago

        Most “humans” don’t understand reality. So your postulative challenge invention isn’t going to find the break you seek to divine. Few exist. I’m yet to find many who can even recognize the notion that this language isn’t made to mean what you think you’re attempting to finagle it into.

        Evil Money Right Wrong Need…

        Yeah… I could go on and on, but there are five sticks humans do not cognate; the public consent about their meaning Will Never be real. The closest you’ll find to any such is imagination, and the only purpose there is to help the delirious learn to cognate the difference and see reality for what it may be.

        Good fucking luck. Half the meat zappers here think I am an AI because I break the notion of consent to any notion of a cohesive language. I won’t iterate that further because I’ve already spelt out why.