A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

  • someguy3@lemmy.world · 4 days ago

    Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

    When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

  • BlameTheAntifa@lemmy.world · 3 days ago

    The classmates who created and shared them should be arrested and charged with distributing CSAM. It’s unimaginable that this would be tolerated to such an extent, and that the victim would then be punished when she was given no other option but to stand up for herself. This country is sick to its core.

    • AA5B@lemmy.world · 3 days ago

      If I can argue somewhat the opposite here… this girl was completely failed by the school system, and those parents ought to be demanding serious changes.

      But schools also take actions that seem unfair when they don’t have evidence, can’t identify all the perpetrators, and want to get the victim away from her bullies. Even if the school had taken it seriously and done the right thing, we probably wouldn’t have liked its actions.

      And even sending the bullies to jail with a kiddie porn conviction may be satisfying, but it’s a bad choice. Bullying your classmates is not really the same as kiddie porn, and schools need to find better ways to handle punishment, so that they graduate a responsible, mature member of society rather than a lifelong criminal.

    • Soulg@ani.social · 3 days ago

      They should be expelled, but why would children be tried the same as an adult doing this? They’re the same age; it’s not pedophilia, it’s normal, expected attraction. Anything beyond expulsion, plus whatever applies to AI porn harassment between adults, is a huge overreaction.

      Would you want two consenting teenagers arrested for CSAM if they’re texting each other nudes? I would hope not.

    • Taldan@lemmy.world · 3 days ago

      Because money is the only thing we, as a country, truly care about. We’re only against things like CP and pedos as long as it doesn’t get in the way of making money. Same reason Trump sharing Larry Nassar and Jeffrey Epstein’s love of “young and nubile” women, as Epstein put it, didn’t kill his political career: he’s the pro-business candidate who makes the wealthy even wealthier.

      • Thebeardedsinglemalt@lemmy.world · 3 days ago

        The orange Nazi could be raping a 12-year-old girl on national TV, then say it’s the libs and drag queens who are the rapists, and his cult would put their domestic terrorist hats back on.

    • ObjectivityIncarnate@lemmy.world · 3 days ago

      You say this as if the US is the only place generative AI models exist.

      That said, the US (and basically every other) government is helpless against the tsunami of technology in general, much less global tech from companies in other countries.

      • Fedizen@lemmy.world · 3 days ago

        I’m saying: why is it so easy for, like, 12-year-olds to find these sites? It’s not exactly a Pirate Bay situation; you can’t generate these kinds of AI videos with just a website copied off a USB and an IP address.

        These kinds of resources should be far easier to shut down access to than The Pirate Bay.

    • Fiery@lemmy.dbzer0.com · 3 days ago

      The problem is that it’s impossible to take out this one application. There don’t need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically just a naked adult but smaller. (Ofc I’m simplifying a bit.)

      Going even further and removing all nudity from the dataset has been tried… and what they found is that removing such a significant source of detailed pictures with a lot of skin decreased the quality of any generated image that has to do with anatomy.

      The solution is not a simple ‘remove this from the training data’. (Not to mention that existing models able to generate these kinds of pictures are impossible to globally disable, even if you were able to affect future ones.)

      As to what could actually be done: apply and keep evolving scanning for such pictures (not on people’s phones, though [looking at you here, EU]). That’s the big problem here: it got shared on a very big social app, not some fringe privacy-protecting app (on that end there is little to do except eliminate all privacy entirely). A rough sketch of the server-side idea is below.
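
      For a sense of what that scanning can look like, here’s a minimal Python sketch using the open-source imagehash library. The hash value and threshold are made-up placeholders; real systems like PhotoDNA use sturdier proprietary hashes and vetted databases (e.g. NCMEC’s), not a hard-coded set.

        # Minimal sketch: match uploads against known-flagged perceptual
        # hashes. Requires `pip install pillow imagehash`.
        from PIL import Image
        import imagehash

        # Placeholder: in practice this comes from a vetted hash database.
        KNOWN_FLAGGED = {imagehash.hex_to_hash("fa5c1e3b9d04a6f2")}
        MAX_DISTANCE = 8  # Hamming-distance tolerance for near-duplicates

        def is_flagged(path: str) -> bool:
            h = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
            # Subtracting two hashes yields their Hamming distance, so
            # resized or recompressed copies still come back as matches.
            return any(h - k <= MAX_DISTANCE for k in KNOWN_FLAGGED)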

      Regulating this at the image-generation level could also be rather effective. There aren’t that many 13-year-olds savvy enough to set up a local model to generate these, so further checks at the places where the images are generated would also help to some degree (a toy example below). Local generation is getting easier to set up by the day, though, so while this should be implemented it won’t do everything.
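
      As a toy illustration of such a generation-side check (the term lists and names here are purely illustrative, not any real service’s moderation API; real services layer prompt filters with classifiers run on the generated image itself):

        # Toy sketch of a prompt-level gate at a generation service.
        EXPLICIT_TERMS = {"nude", "naked", "undressed"}
        MINOR_TERMS = {"child", "kid", "teen", "schoolgirl"}

        def prompt_allowed(prompt: str) -> bool:
            text = prompt.lower()
            wants_explicit = any(t in text for t in EXPLICIT_TERMS)
            involves_minor = any(t in text for t in MINOR_TERMS)
            # Refuse before the model ever runs.
            return not (wants_explicit and involves_minor)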

      In conclusion: it’s very hard to eliminate this, but ways exist to make it harder.

      • papertowels@mander.xyz · 3 days ago

        Snapchat allowing this on their platform is the insane part to me. How are they still operating if they’re letting CSAM on the platform??

        • michaelmrose@lemmy.world · 3 days ago

          There is no reason to believe Biden is a villain here; meanwhile, Trump was found to be a rapist in court.

        • Echo Dot@feddit.uk · 3 days ago

          Because the question was political. I’m sorry that you’ve got such a teeny tiny brain that you can’t work out that if somebody asks a political question then the response must necessarily be political. I don’t know how else to put it.

        • BarneyPiccolo@lemmy.today · 3 days ago

          EVERYTHING is political these days; you just get tired of defending against a corrupt, traitorous, racist, misogynist, ignorant, incompetent PEDOPHILE.

          And ANYONE who supports him is all those same things themselves. Repeat: ALL MAGAs are corrupt, treasonous, racist, misogynist, ignorant, incompetent, and PEDOPHILES.

          That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

          • jve@lemmy.world · 3 days ago

            That includes YOU. You are defending him, that makes YOU a PEDOPHILE.

            I’m with you mostly, but words do mean things, even in this post-fact society.

            • BarneyPiccolo@lemmy.today · 3 days ago

              I understand that, which is why I want to make it very clear that anyone who voted for Trump is a Pedophile.

              Don’t like it? Don’t vote for pedophiles.

              • jve@lemmy.world · 3 days ago

                Sure, bud.

                I guess that makes all voters politicians, then?

                Or just voters that defend politicians?

                Not real clear how this transitive property is supposed to work.

                • BarneyPiccolo@lemmy.today · 3 days ago

                  No, just voters that are MAGAs, which supports and defends pedophiles as an official tentpole of its party philosophy.

                  It’s simple: anyone who supports and defends pedophiles is a pedophile. If you vote MAGA, which is ANY right-wing/conservative candidate, then you are a Pedophile.

                  It’s so simple, even a MAGA pedophile like you can understand it.

        • prole@lemmy.blahaj.zone · 3 days ago

          You really want to go down the “which president wants to fuck his daughter” route?

          You sure about that?

    • BarneyPiccolo@lemmy.today · 3 days ago

      Because our country is literally being run by an actual pedophile ring.

      They’d be more likely to want to know how to do it themselves, than to stop it.

    • Duamerthrax@lemmy.world · 3 days ago

      Guidance counselors, teachers, and administrators don’t like listening to kids anywhere. I used to get in trouble for fighting my bullies even when the bullying happened right in front of the teacher.

      • jj4211@lemmy.world · 3 days ago

        Also, they will be laser-focused on preserving authority over making things right.

        When they make a mistake? Well, no, they didn’t, because to admit a mistake is to acknowledge being fallible, and to be fallible is to undermine your authority.

        In this case they still torpedoed her shot at extracurricular activities, even after amending her punishment in the face of overwhelming evidence, when the girl reasonably felt she had zero recourse after doing everything the right way to start.

    • juko_kun@sh.itjust.works · 3 days ago

      I mean, law enforcement doesn’t have enough resources to go after people making real CP.

      What makes you think they can go after everyone making fake CP with AI?

      • Phoenixz@lemmy.ca · 3 days ago

        They do have resources, especially in the US. They do go after real CP, and people go to jail for it on a near-daily basis.

        This, too, could have been investigated better, which is kind of the point of the article.

        Why are you so okay with child pornography? Checking your message history really shows you being completely fine with CP, yet you really have it out for the victim.

    • klugerama@lemmy.world · 4 days ago

      What? RTFA. 2 boys were charged by the Sheriff’s department. They didn’t face any punishment from the school, but law enforcement definitely investigated.

    • troglodytis@lemmy.world · 4 days ago

      Correct. They will not investigate it further than threatening the victims with prosecution. The goal is that the victim doesn’t pursue it further.

      They don’t know how to properly investigate it, and they are not interested in knowing. They see it as both ‘kids being kids’ and ‘if this gets out it will give our town a bad name’.

      I’m glad the kid and her family aren’t letting this go!

      • Buelldozer@lemmy.today · 4 days ago

        They will not investigate it further than threatening the victims with prosecution.

        Read.The.Whole.Article.

        • troglodytis@lemmy.world · 4 days ago

          Yes, after the kid had to take matters into her own hands.

          She asked for help. The officer said no. She didn’t let it go, and escalated the issue as the sexual harassment progressed. Only when forced did they investigate.

          • Buelldozer@lemmy.today · 4 days ago

            She asked for help. The officer said no.

            No, they didn’t, and if they did, that information is not in this article. She went to the guidance counselor at 7 AM, then to the onsite Sheriff’s Deputy after. She texted her father and sister about 2 PM. The SD couldn’t immediately find anything, but it appears they didn’t stop looking, because 3 weeks later they were charging the boys.

            So unless you have another source with a different timeline or more information, your original comment was inaccurate. Sort of like the ragebait headline and the ragebait summary.

            • Clent@lemmy.dbzer0.com · 4 days ago

              You’re simping hard for the police in here. There is no proof that any of the charges would have occurred had people not become outraged. The school definitely needed this pressure.

              You must have a lot of cops in your family, because I can’t think of a reason anyone would be such a massive cheerleader for professional thugs without some personal relationship.

    • Riskable@programming.dev · 4 days ago

      The article states that the police investigated but found nothing. The kids knew how to hide/erase the evidence.

      Are we really surprised, though? Police are about as effective at digital sleuthing as they are at de-escalation.

      • Buelldozer@lemmy.today · 4 days ago

        The article states that the police investigated but found nothing.

        You should have kept reading.

        "Ultimately, the weeks-long investigation at the school in Thibodaux, about 45 miles (72 kilometers) southwest of New Orleans, uncovered AI-generated nude images of eight female middle school students and two adults, the district and sheriff’s office said in a joint statement.”

      • pelespirit@sh.itjust.works · 4 days ago

        When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

      • IninewCrow@lemmy.ca · 4 days ago

        Unless they can pull out their gun and shoot at something or someone … or tackle someone … they aren’t very good at doing anything else.

        • cyberwitch@reddthat.com · 4 days ago

          Literally verbatim what an officer said when we couldn’t get a hold of animal control and he got sent over instead…

      • mic_check_one_two@lemmy.dbzer0.com · 4 days ago

        The article later states that they continued investigating, and found ten people (eight girls and two adults) who were targeted with multiple images. They charged two boys with creating and distributing the images.

        It’s easy to jump on the ACAB bandwagon, but real in-depth investigation takes time. Time for things like court subpoenas and warrants, to compel companies like Snapchat to turn over message and image histories (which they do save, contrary to popular belief). The school stopped investigating once they discovered the kids were using Snapchat (which automatically hides message history) but police continued investigating and got ahold of the offending messages and images.

        That being said, only charging the two kids isn’t really enough. They should charge every kid who received the images and forwarded them. Receiving the images by itself shouldn’t be punished, because you can’t control what other people spontaneously send you… But if they forwarded the images to others, they distributed child porn.

        • wheezy@lemmy.ml · 3 days ago

          At the end of the day, these are children; no meaningful punishment ends with just these boys punished. Justice would be finding the source of whoever created these images. I’m honestly highly doubtful it was these kids alone. This really should cast suspicion on the adults in these boys’ lives. An investigation that stops at punishing children for child sexual abuse material is not a thorough investigation at all.

          It’s possible these boys were able to generate these images on their own (meaning without help from anyone in their real-life interactions). But even if that was the case, the investigation should not stop there.

    • pelespirit@sh.itjust.works · 4 days ago

      When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

    • Buelldozer@lemmy.today · 4 days ago

      Your question was answered in the article but you clearly stopped at either the outrage bait headline or the outrage bait summary.

      “Ultimately, the weeks-long investigation at the school in Thibodaux, about 45 miles (72 kilometers) southwest of New Orleans, uncovered AI-generated nude images of eight female middle school students and two adults, the district and sheriff’s office said in a joint statement.”

      • Echo Dot@feddit.uk · 4 days ago

        That was the investigation by the police, not the school.

        What we’re asking is why the school didn’t investigate given that the police had already been contacted.

        • Logi@lemmy.world · 3 days ago

          Because a school can’t compel Snapchat to release “disappeared” images and chat logs. So perhaps in this case it was best left to the police.

          • Echo Dot@feddit.uk · 3 days ago

            It wasn’t left to the police; she’d already gone to the police. It sounds from the story like the school did literally nothing at all.

            Also, you don’t need to compel Snapchat to release the images. They’re 13-year-old boys; they absolutely have permanent copies on their phones.

            • Yeather@lemmy.ca · 3 days ago

              How can the school compel the boys to show the permanent copies then? I think you are overestimating the power of the school in this scenario.

              • jj4211@lemmy.world · 3 days ago

                The school doesn’t even need to do that to effectively squash suspected behavior in the short term.

                Maybe they can’t dole out a substantive punishment, but when I was growing up they absolutely would lean on kids for even being suspected of doing something, or even if they hadn’t done it yet but the administration could see it coming. Sure, they might have wasted some time on kids who truly weren’t up to anything, but there generally weren’t actual punishments of consequence in those cases. I’m pretty sure a few things were prevented entirely, just by the kids being told that the administration saw it coming.

                So they should have at least been able to effectively suppress the student body behavior while they worked out the truth.

              • BarneyPiccolo@lemmy.today · 3 days ago

                Saying there is nothing they can do is the standard cop-out for lazy administrators.

                They are minors in school, under the legal supervision of the school. There are LOTS of things a school can do, and courts have been finding mostly on the side of schools for decades.

                Without even trying, I can think of a dozen things the school could have done, including banning phones from the suspects until the investigation is over.

                But they chose to do nothing, then punished the victim when she defended herself after the school refused to act.

                • Yeather@lemmy.ca · 3 days ago

                  Banning phones during the investigation does not give the administration evidence to work with. Even if they took the phones, the school still couldn’t force the students to unlock them. The only way to get the evidence needed was through the police.

        • lightnsfw@reddthat.com · 3 days ago

          I mean, the police are the proper individuals to be investigating CSAM. The school bringing them in immediately would have been the correct action. School officials aren’t trained to investigate crime.

          • BarneyPiccolo@lemmy.today · 3 days ago

            Perhaps the cops are the proper investigative arm, but the school system had an obligation to assist in that investigation, and not ignore it, then deny it, then cover it up.

            The entire leadership of the school should be fired, and the principal should be prosecuted.

    • arin@lemmy.world · 4 days ago

      The boy probably has networked parents, gonna be the future Mark Zuckerberg

        • KairuByte@lemmy.dbzer0.com · 3 days ago

          No, they usually are. Even kids sharing their own nudes with their SO privately have the book thrown at them; these kids are making CSAM of a third party and distributing it across the school.

  • JTskulk@lemmy.world · 4 days ago

    That’s how this school shit goes, zero tolerance really means zero critical thought. You were involved in a fight? That’s school violence!

    I always tell my personal story when this comes up. In high school I was beefing with my friend over some stupid shit. One day he put me in a headlock and started to fight me. With a free hand I reached up (he was a very big guy btw, I was very small), grabbed his glasses off his face, and squeaked out to let go of me or I’d jam them into his eye. He spun me around and tried to knee me in the nuts (luckily he missed) and then yelled at me. We both got suspended for fighting when all I did was get beat up by the guy.

  • Angelevo@feddit.nl · 3 days ago

    Okay. I hope you have all enjoyed venting your emotions on the matter. Empathy is absolutely appreciated. Now, going forward, may I set an example here and ask you all to do similarly in the future: include how it could have been handled better.

    There are many ways to go at a situation like this. Teach the kids some social skills. Just a few off the top:

    • Parenting, social control – kids can speak up and say “yo this is not okay”.
    • Own it: We were born naked. We are naked in our clothes, it is all the same. If you have the confidence, say: “Psh, the real thing is MUCH better”.
    • Turn it around: “What do you need a fake nude for, can you not get a girl to be naked with you? Pathetic…”
    • condescending tone “Aw thanks, you’re thinking about me that much? Cute.” etc.

    I understand, it’s not always easy, more so at that age and depending on personality. This style can be expanded; make it a thing. Nix the bullshx.

    • postmateDumbass@lemmy.world · 3 days ago

      Your points 2, 3, and 4 would be possible options for adults, maybe for 18+.

      But for a 13-year-old it is de facto sexualizing a child.

      • Angelevo@feddit.nl · 2 days ago

        The age considered ‘appropriate’ for sex varies across cultures. If we cool our heads and face reality: many humans start showing interest in sex at even younger ages. Again: parenting, education.

    • Taldan@lemmy.world · 3 days ago

      Include how it could have been handled better

      The Sheriff could have actually investigated. Snapchat retains copies of all messages sent, so the excuse that the messages are quickly deleted is entirely hollow. This is the production and distribution of CSAM, a very serious crime, and the school should have cracked down on phone use while it was being investigated.

      Own it: We were born naked. We are naked in our clothes, it is all the same. If you have the confidence, say: “Psh, the real thing is MUCH better”.

      These are children. I cannot emphasize that enough. Children are being used for sexual gratification against their will. Your response to that is “own it”? You may want to review your position here

      Turn it around: “What do you need a fake nude for, can you not get a girl to be naked with you? Pathetic…”

      And in what way would that make the child being sexually exploited whole? It does not stop the person creating it. It does not stop them distributing it. In what way does belittling other children help her?

      To reiterate one more time, this is a 13-year-old girl being sexually exploited against her will, yet here you are sounding like the President of the United States on a trip to Epstein island

    • ByteOnBikes@discuss.online · 3 days ago

      God, I hope this doesn’t get deleted because it needs to stay up. It’s a great example of the Dunning-Kruger effect.

      • Angelevo@feddit.nl · 2 days ago

        Interesting and incorrect analysis. Would you like to add some substance, or did you just need to vent?

        • Nikelui@lemmy.world · 3 days ago

          When law enforcement and educators have failed you, maybe violence is the answer sometimes.

          • juko_kun@sh.itjust.works · 3 days ago

            For someone that isn’t violent towards you, anyone else, or your property?

            I dunno… but maybe we were just raised differently. I couldn’t imagine getting into a fight with someone in the real world over this. It’d be very dangerous and it’s just not worth the risk.

            I’d rather ignore them and move on, lol.

            • lightnsfw@reddthat.com · 3 days ago

              People like you are the reason the world is in the state it is. If more assholes got the shit stomped out of them whenever they did asshole shit, we’d be a lot better off. Instead you motherfuckers expect everyone to turn the other cheek and get walked all over.

            • Phoenixz@lemmy.ca · 3 days ago

              Yeah, you say that easily but nobody is spreading nudes of yours around, AI generated or not.

              Violence isn’t a solution but if it’s the only option left, like it appeared here, it’s justifiable.

              That’s not to mention that, as per usual in the US, the bullies are protected and the victims get their lives utterly destroyed. This little girl got expelled and had to switch schools because of what others did, and you’re defending the bullies because of what, exactly?

              And your “suck it up” advice also seems a bit weird, considering this is child pornography.

  • Lemming6969@lemmy.world · 3 days ago

    Nothing is real or can be considered real anymore. We are going to need new frameworks to handle a world where video of illegal or embarrassing things can be trivially created by anyone.

    People are saying the AI vendor should be liable, but that’s short-sighted in a world where anyone can do this at home with largely anonymous distribution.

    • Digit@lemmy.wtf · 2 days ago

      Yeah, the people saying that need to check the logic. It’s about as absurd as saying manufacturers of kitchen knives should be liable for harmful abuses of kitchen knives.

  • SpicyTaint@lemmy.world · 4 days ago

    Snapchat, an app that deletes messages seconds after they’re viewed

    Say the smooth-brained idiots who don’t recall the Snappening.

    • glimse@lemmy.world · 4 days ago

      You can call them idiots, but it is you who misremembers. Snapchat’s role in the Snappening was a failure to crack down on third-party clients.

      Snapchat didn’t store the photos (that we know of); a third-party app’s server did.

      • SpicyTaint@lemmy.world · 4 days ago

        I can’t misremember if I never read the original story about what happened, lol. But you are technically correct; it looks like it was a third party saving the data.

        The actual shortsightedness is thinking that data transmitted from any device is temporary. Snapchat would have logs at the very least. Probably chat messages, if not everything.

    • NotSteve_@piefed.ca · 4 days ago

      I’d mainly consider the smooth-brained part the assumption that tech companies willingly delete any data they have access to

      Side note but doesn’t this basically mean Snap has CSAM on their servers?

      • SpicyTaint@lemmy.world · 4 days ago

        I wouldn’t be surprised. No one with the power or authority to do something about it gives a shit, though.

  • Digit@lemmy.wtf · 2 days ago

    For years [decades even] I’ve sought (inadequately) to avoid cameras, aware (in part from my art and special-effects aptitude and education) that this sort of thing, and more, was coming. And now it’s here. And maybe now more mice will realise, in the trap, that it’s a trap. :/ It wasn’t an easy thing to try to warn people about, when so many were incapable of conceiving of the threat, let alone to build sufficient awareness in sufficient people to effect good change and avert it. Though mostly I’d only considered the threat from governments and corporations; now it’s in the hands of almost anyone. And far faster and easier. What “fun”. :-|

    • CoffeeTails@lemmy.world · 2 days ago

      I didn’t really think about it until Nano Banana Pro became a thing. Before that, I knew it was possible, but the results would be so obviously AI. Now? It’s not obvious anymore. I’m even considering creating a digital avatar to use instead of my own face.

      • Digit@lemmy.wtf · 2 days ago

        Yup:

        I’m even considering creating a digital avatar to use instead of my own face.

        The notion has popped into my thoughts many times.

        Like a no-skill puppet to hide behind to preserve privacy while still putting stuff out there.

        • CoffeeTails@lemmy.world · 1 day ago

          Yes exactly!

          It somehow feels more doable now, since avatars on other sites are becoming more common; I mostly think of Facebook/Meta and Snapchat. But there are also the generated images that have been trending (“me as an action figure”, “me in Ghibli style”, etc.), and even the filters some people use consistently.

          I think people are getting more used to seeing different types of avatars.

  • some_guy@lemmy.sdf.org · 4 days ago

    The principal had doubts they even existed.

    Holy shit, this person needs to lose their job. I don’t work in education and I still know that this is a huge problem everywhere.

    • jaselle@lemmy.ca · 4 days ago

      How can we know that in this particular instance they do exist? If this were a reliable way to get someone expelled without any evidence, then if I were a bully I’d accuse other people of making deepfakes of me.

      • CmdrShepard49@sh.itjust.works · 4 days ago

        They could ask around about them. Surely one kid would be willing to spill the beans. They’re a bunch of 13-year-olds, not criminal masterminds.

          • CmdrShepard49@sh.itjust.works · 4 days ago

            Well the police apparently found additional images depicting eight individuals and arrested two boys, so it seems like some tactic along these lines worked. Meanwhile the school administrators threw up their hands, called the situation “deeply complex,” and did nothing but punish the victim.

            Definitely agree that Snapchat is bad, along with most social media, especially for kids. I can’t imagine what it’s like growing up in the current era with all this extra bullshit. I was lucky enough to grow up at a time when people didn’t have the internet.

            • ChickenLadyLovesLife@lemmy.world · 3 days ago

              I’m a school bus driver and a few years ago I had an incident where some kids threw food at me on the bus (goldfish crackers, of all things). Another kid made a video recording of the incident and posted it online and that caused a huge kerfuffle at the school. The admins couldn’t understand that I didn’t give even the tiniest fuck about the posted video.

              • bthest@lemmy.world · 3 days ago

                “So it doesn’t piss you off that they posted a video? Because now we have to do something about it. And doesn’t THAT bother you at all?”

                • ChickenLadyLovesLife@lemmy.world · 3 days ago

                  Because now we have to do something about it.

                  Your comment made me realize something. The week prior to this some of the kids threatened to kill me (via dad’s gun and wrapping a plastic bag around my head) and the school did nothing. The goldfish-flinging incident got the kids suspended from the bus for a week. It didn’t occur to me until now that perhaps the admins only did something because of the posted video.

      • jj4211@lemmy.world · 3 days ago

        Well in this particular instance they were able to find them and absolutely confirmed they do exist.

        But to at least address that risk, they should have been able to make the offenders scared of being found out, so that they would at least stop actively doing it. They should have been able to squash the behavior even before they could hand out a meaningful punishment.

        I know when I was in school they would threaten punishment for things that hadn’t been done yet. I think a lot of kids declined to do something because the school had indicated they knew kids would do something and that would turn out badly.

  • Taldan@lemmy.world · 3 days ago

    the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them

    If the Sheriff couldn’t get the images, it’s because he didn’t bother to. It’s a well-known fact that Snapchat retains copies of all messages.

    • NotMyOldRedditName@lemmy.world · 3 days ago

      Is the allegation of CSAM enough to get a warrant? If the Sheriff saw it once, then absolutely, but without that?

      Edit: I do imagine that if a child testified to receiving it, that would be enough to get a warrant for their messages, which would then show it was true, which could then lead to a broader warrant. No child who shared it, though, would testify to that.