themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 13 days ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
99 comments · cross-posted to: fuck_ai@lemmy.world, pulse_of_truth@infosec.pub, technology@lemmy.zip
Goodlucksil@lemmy.dbzer0.com · 12 days ago
Material. Type of material: Image
bobzer@lemmy.zip · 12 days ago
Why say "sexual abuse material images", which is grammatically incorrect, instead of "sexual abuse images", which is what you mean, and shorter?