Simple Optimization Trick
Maven (famous)@lemmy.zip to Programmer Humor@programming.dev · 5 months ago · image post · 67 comments
HugeNerd@lemmy.ca · 5 months ago
Have you seen the insane complexity of modern CPUs? Ain't no one hand coding that like a 6502 in 1985.
skuzz@discuss.tchncs.de · 5 months ago
Even if one did, say on x86, it would still just be decoded by the CPU into its own native micro-ops, since the legacy instruction set is interpreted/translated under the hood.
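(A toy sketch of the idea above, not real x86: the opcode names and micro-op strings here are made up for illustration. Modern front-ends decode each legacy instruction into a sequence of simpler internal micro-ops before execution, roughly like this:)

```python
# Hypothetical microcode table: one "legacy" CISC-style instruction
# maps to a sequence of simpler RISC-like micro-ops.
MICROCODE = {
    "INC_MEM": ["LOAD tmp, [addr]", "ADD tmp, 1", "STORE [addr], tmp"],
    "PUSH":    ["SUB sp, 8", "STORE [sp], src"],
    "MOV":     ["MOV dst, src"],  # simple instructions map 1:1
}

def decode(instruction: str) -> list[str]:
    """Translate one legacy instruction into its micro-op sequence."""
    return MICROCODE[instruction]

if __name__ == "__main__":
    for insn in ["MOV", "INC_MEM"]:
        print(insn, "->", decode(insn))
```

So even hand-written assembly is really input to this translation layer; the programmer never sees the micro-ops the core actually executes.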
BigDanishGuy@sh.itjust.works · 5 months ago
> as the legacy instruction sets are interpreted/translated.

Wth? That's it, I'm sticking to the AVR then
musubibreakfast@lemmy.world · 5 months ago
I wonder if there's anyone alive right now who would be capable of such a task.
Blackmist@feddit.uk · 5 months ago
If the hardware was fixed, I don't see why not. Might not be as fast as the optimisations compilers do these days, though.

If you have to support thousands of types of GPU and CPU and everything else, then fuck no.