RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 10 days ago
OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide (arstechnica.com)
Cross-posted to: fuck_ai@lemmy.world, technology@lemmy.world, nottheonion@sh.itjust.works, aboringdystopia@mander.xyz, nottheonion@lemmy.ml
Joe@lemmy.world · English · 9 days ago
It certainly should be designed to handle those types of queries, though. At the very least, it should avoid discussing the topic.
Wouldn't ChatGPT be liable if someone planned a terror attack with it?