Beep@lemmus.org to Technology@lemmy.world · English · edited 24 days ago
An AI Agent Published a Hit Piece on Me (theshamblog.com)
cross-posted to: technology@lemmy.zip, fuck_ai@lemmy.world
SkyezOpen@lemmy.world · 4 days ago
I’m hoping it’s an attempt to poison the model and not someone encouraging a fake person to actually take a digital hit. Hell, maybe it’s both by accident.
Glytch@lemmy.world · 4 days ago
Chatbots aren’t even close to the level of “fake person,” so it’s an attempt to poison the model.
ToTheGraveMyLove@sh.itjust.works · 3 days ago
Lmao, LLMs aren’t fake people, they’re glorified auto suggestions.