QuentinCallaghan@sopuli.xyz to Political Memes@lemmy.ca · 11 months ago: "Meanwhile at DeepSeek" (image post, sopuli.xyz)
474D@lemmy.world · 11 months ago: I mean, obviously you need to run a lower-parameter model locally; that's not a fault of the model, it's just not having the same computational power.
AtHeartEngineer@lemmy.world · 11 months ago: In both cases I was talking about local models: the DeepSeek-R1 32B-parameter model vs an equivalent uncensored model from Hugging Face.
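For context, a minimal sketch of what "running a 32B-parameter model locally" looks like with the Hugging Face `transformers` library. The model ID below points at DeepSeek's distilled 32B checkpoint; a community "uncensored" variant would be swapped in by changing `model_id`. This is an illustrative assumption about the commenters' setups, not their actual configuration, and it presumes you have enough GPU/CPU memory for a model of this size.

```python
# Minimal local-inference sketch, assuming the `transformers` library and
# sufficient VRAM/RAM for a 32B-parameter checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Official distilled 32B checkpoint; an uncensored community variant would
# simply be a different repo ID here (illustrative assumption).
model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # spread layers across available GPU/CPU memory
    torch_dtype="auto",   # use the checkpoint's native precision (e.g. bf16)
)

prompt = "Why are lower-parameter models used for local inference?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point of the exchange stands either way: the parameter count you can run is bounded by local hardware, independent of which checkpoint (censored or not) you load.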