wow. sensible (awful.systems)
[-] Soyweiser@awful.systems 20 points 5 months ago

Yes, we know (there are papers about it) that for LLMs, every increase in capabilities requires exponentially more training data. But don't worry, we've only consumed half the world's data to train LLMs, still plenty of places to go ;).
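A rough back-of-envelope sketch of what that claim looks like, assuming a Chinchilla-style power-law loss curve L(D) = E + A / D^beta. The constants below are purely illustrative, not fitted values and not from the comment; the point is just that equal capability steps demand a constant multiplicative jump in data, i.e. exponential growth in tokens.

```python
# Illustrative sketch only: assume loss follows L(D) = E + A / D**beta as a
# function of training tokens D. All constants here are made-up placeholders.
E = 1.69      # irreducible loss (illustrative)
A = 410.0     # data-term coefficient (illustrative)
beta = 0.28   # data exponent (illustrative)

def tokens_needed(reducible_loss: float) -> float:
    """Tokens D such that A / D**beta equals the target reducible loss."""
    return (A / reducible_loss) ** (1.0 / beta)

reducible = 0.4  # starting reducible loss (illustrative)
for step in range(5):
    D = tokens_needed(reducible)
    print(f"step {step}: loss {E + reducible:.3f} -> ~{D:.2e} tokens")
    reducible /= 2  # one "capability step": halve the reducible loss

# Each halving multiplies the token requirement by 2**(1/beta) ≈ 12x,
# which is why the available data runs out after only a few such steps.
```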

this post was submitted on 09 Jun 2024
63 points (100.0% liked)

TechTakes

1427 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 1 year ago