[-] corbin@awful.systems 9 points 3 months ago

Calling it now: codepoint-level non-tokenization, with a remapping step that recognizes only the few thousand most popular codepoints, would outperform what OpenAI has forced itself into using. The evidence is circumstantial but strong, e.g. arithmetic isn't learned correctly because BPE tokenizers obscure Arabic digits. They can't backpedal on this without breaking some of their API and re-pretraining a model, and they make a big deal about how expensive GPT pretraining is, so they're stuck in their local minimum.
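The scheme corbin describes could be sketched roughly like this (a minimal illustration, not anyone's actual implementation; `build_vocab`, `encode`, and the `UNK` convention are all assumptions for the sake of the example): count codepoint frequencies over a corpus, keep the top K, and map everything else to a single unknown id. Note that every Arabic digit gets its own stable id, instead of being merged into opaque multi-digit BPE chunks.

```python
# Hedged sketch of codepoint-level "tokenization" with a popularity cutoff.
# All names here are illustrative, not from any real tokenizer library.
from collections import Counter

UNK = 0  # id reserved for codepoints outside the top-K vocabulary

def build_vocab(corpus: str, k: int = 4096) -> dict[str, int]:
    """Map the k most frequent codepoints to ids 1..k; everything else -> UNK."""
    counts = Counter(corpus)
    return {cp: i + 1 for i, (cp, _) in enumerate(counts.most_common(k))}

def encode(text: str, vocab: dict[str, int]) -> list[int]:
    """One id per codepoint -- no merging, so '123' is always three ids."""
    return [vocab.get(cp, UNK) for cp in text]

vocab = build_vocab("the quick brown fox 123 123 123", k=16)
ids = encode("123", vocab)
print(ids)  # three distinct non-UNK ids, one per digit
```

The cost is longer sequences than BPE produces, which is presumably part of why OpenAI went the other way; the claimed upside is that surface-level structure like digits stays visible to the model.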

[-] anton@lemmy.blahaj.zone 6 points 3 months ago

But then it can't SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp

[-] UnseriousAcademic@awful.systems 4 points 3 months ago

The only viable use case, in my opinion, is to utilise its strong abilities in SolidGoldMagikarp to actualise our goals in the SolidGoldMagikarp sector and achieve increased margins on SolidGoldMagikarp.

this post was submitted on 26 Aug 2024

TechTakes