woag - hmmmmm but that looks like a random probability distribution of just a few tokens… hmmm… best case we have it walk thru an octree of all tokens of our own tokenizer and pick the correct ones progressively.
orrrr… iduno… ummmm… hmm…
see - we cant really make… agents happen with this, u see? if we wanted random samples, we'd use some pseudo-random number generator and hook it up to the current time… LMs r time-independent and generate deterministic outcomes using a given text prompt and seed - but your model doesnt have that… i guess ur zero-electricity point is quite interesting tho… hmm-
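(the determinism point, as a toy sketch - a made-up mini "LM" where same prompt + same seed always gives the same output, nothing like a real model ofc:)

```python
import random

def generate(prompt: str, seed: int, n_tokens: int = 5) -> list[str]:
    """Toy 'LM': deterministic given (prompt, seed), like a real sampler."""
    vocab = ["the", "cat", "sat", "on", "a", "mat", "and", "purred"]
    # fold the prompt into the seed deterministically
    # (avoids Python's per-process str hash randomization)
    state = seed
    for ch in prompt:
        state = (state * 31 + ord(ch)) % (2**32)
    rng = random.Random(state)
    # same prompt + seed -> same pseudo-random stream -> same tokens, every time
    return [rng.choice(vocab) for _ in range(n_tokens)]
```

(run it twice with the same inputs and u get the exact same "text" - that repeatability is what the die-based setup is missing)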
we couuuuuld have it hooked up to a visual interpreter, which sees the ball's outcome and then picks from a set of predefined sentence blocks, to then create new sentences based off of that. now we r at the problem from earlier tho: we've only got so many output states with the die inside.
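(smth like this maybe - all names made up, just a sketch of the interpreter idea, and u can literally count the output states:)

```python
# hypothetical: the visual interpreter reads the die face and picks a
# predefined sentence block - the whole system can only ever emit these
SENTENCE_BLOCKS = {
    1: "move forward",
    2: "turn left",
    3: "turn right",
    4: "wait",
    5: "pick up",
    6: "put down",
}

def interpret(die_face: int) -> str:
    # every possible output is already listed above: six faces, six states
    return SENTENCE_BLOCKS[die_face]
```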
so we could - theoretically - have this model navigate a given environment using code output patterns instead of sentences - so… maybe this could be interesting…
octree??
yea - an octree - the tree structure where u walk forward and find "more detail" along the path. LMs essentially do the same, but each branching point is a token and each token has about 30,000 different possibilities - for different branches (thats the models "token vocabulary") so yea-… "octrees"… sure exist.
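(the tree-walk picture, as a toy sketch - fake scoring function, tiny vocab standing in for the ~30,000-way branching of a real tokenizer:)

```python
# toy sketch: decoding as walking a tree whose branching factor is the
# vocabulary size (~30,000 in many real tokenizers; tiny here on purpose)
VOCAB = ["a", "b", "c"]  # branching factor 3 instead of ~30,000

def next_token_scores(path):
    # stand-in for a real model: scores depend only on the path so far
    return [(len(path) + i) % 7 for i in range(len(VOCAB))]

def greedy_decode(n_steps):
    path = []
    for _ in range(n_steps):
        scores = next_token_scores(path)
        # each step picks exactly one branch out of len(VOCAB) children
        path.append(VOCAB[scores.index(max(scores))])
    return path
```

(each step is one branching point; a real LM just has vastly more branches per node)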
anyway - have a nice day ~ ~ ~<3
huh! i thought octrees could be used to search token space for the closest tokens, but i had never seen the word "octree" in a machine learning context. also, i think that's just called a "tree", "oct-" means "eight"
oooooooh ur right. sili meeeeeh