I remember one time in a research project I switched out the tokeniser to see what impact it might have on my output. Spent about a day re-running everything, and the difference was minimal. I imagine it's wholly the same thing.
*Disclaimer: I don't actually imagine it is wholly the same thing.
there's a research result that the precise tokeniser makes bugger all difference; it's almost entirely the data you put in
because LLMs are lossy compression for text
latent space go brrrr