themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 13 days ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
99 comments · cross-posted to: fuck_ai@lemmy.world, pulse_of_truth@infosec.pub, technology@lemmy.zip
bobzer@lemmy.zip · English · 12 days ago
Why say "sexual abuse material images," which is grammatically incorrect, instead of "sexual abuse images," which is what you mean, and shorter?