- cross-posted to:
- Technology@programming.dev
- fuck_ai@lemmy.world
cross-posted from: https://programming.dev/post/36866515
Comments
No that’s only a tiny part of what LLMs do.
When you enter a sentence, it first parses the sentence into vectors, then it ranks those vectors, then it looks the vectors up in a database, and then it reconstructs the sentence from the information it has obtained.
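For what it's worth, the pipeline that comment describes can be sketched as a toy nearest-neighbour lookup. This is purely illustrative: real LLMs predict tokens with a neural network rather than querying a database, and every word and number below is invented for the example.

```python
# Toy sketch of the described pipeline: tokenize a sentence, map each
# token to a vector, look the vector up in a small "database" of stored
# vectors, and rebuild a sentence from the matches. Invented data only.

# hypothetical embedding table: word -> vector
embeddings = {
    "the": (0.1, 0.9),
    "cat": (0.8, 0.2),
    "sat": (0.5, 0.5),
}

# hypothetical "database": stored vector -> word
database = {vec: word for word, vec in embeddings.items()}

def distance(a, b):
    # squared Euclidean distance between two vectors
    return sum((x - y) ** 2 for x, y in zip(a, b))

def reconstruct(sentence):
    out = []
    for word in sentence.lower().split():
        vec = embeddings.get(word)
        if vec is None:
            continue  # unknown words are simply dropped in this toy
        # nearest-neighbour lookup against the stored vectors
        nearest = min(database, key=lambda v: distance(v, vec))
        out.append(database[nearest])
    return " ".join(out)

print(reconstruct("The cat sat"))  # -> "the cat sat"
```

Because each query vector exactly matches a stored vector here, the lookup just round-trips the input; a real model's behaviour comes from learned weights, not retrieval.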
But what is truth? As Lionel Huckster would say.
Most of these so-called “hallucinations” are not errors at all. What has happened is that people have gone through multiple prompts and only posted the last result.
For instance, one example was where Gemini suggested cutting the legs off a couch to fit it into a room. What the poster failed to reveal was that they were using Gemini to come up with solutions to problems in a text adventure game…