Natanael@slrpnk.net 0 points 7 months ago* (last edited 7 months ago)

Not an LLM specifically; in particular, the lack of backtracking, together with limits on network depth and interconnectivity, sets hard limits on its capabilities.

https://www.lesswrong.com/posts/XNBZPbxyYhmoqD87F/llms-and-computation-complexity

https://garymarcus.substack.com/p/math-is-hard-if-you-are-an-llm-and

https://arxiv.org/abs/2401.11817

https://www.marktechpost.com/2023/08/01/this-ai-research-dives-into-the-limitations-and-capabilities-of-transformer-large-language-models-llms-empirically-and-theoretically-on-compositional-tasks/?amp
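To make the backtracking point concrete, here is a rough sketch in plain Python (no real model; the layer count, arithmetic, and toy puzzle are all placeholders): a transformer spends a fixed number of sequential layer steps per generated token no matter how hard the question is, while a classical backtracking search can revise earlier guesses and spends more work on harder instances.

```python
# Toy contrast between a fixed-depth forward pass and a backtracking search.
# NUM_LAYERS and the arithmetic are placeholders, not any real architecture.

NUM_LAYERS = 4  # a transformer has a fixed layer count, so fixed sequential depth per token


def forward_pass(embedding):
    """One generated token = exactly NUM_LAYERS sequential steps,
    regardless of how hard the problem in the prompt is."""
    h = embedding
    for _ in range(NUM_LAYERS):
        h = [0.5 * x + 0.1 for x in h]  # stand-in for attention + MLP
    return h


def subset_sum(nums, target, chosen=()):
    """Backtracking search: commit to a choice, recurse, and undo it if it
    fails. The amount of work grows with the instance; guesses get revised."""
    if target == 0:
        return chosen
    if not nums or target < 0:
        return None
    head, rest = nums[0], nums[1:]
    return (subset_sum(rest, target - head, chosen + (head,))
            or subset_sum(rest, target, chosen))


print(forward_pass([1.0, 2.0, 3.0]))    # always NUM_LAYERS steps of compute
print(subset_sum([3, 9, 8, 4, 5], 16))  # work depends on the instance
```

The second function is just ordinary depth-first search; the point of the sketch is only that its recursion depth and branching track the problem, while the loop above never does.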

Humans have a completely different memory model and, in large part, a very different way of linking learned concepts together to form a world view and develop interdisciplinary skills, which lets us solve many kinds of highly complex tasks as long as we can keep enough of the problem in memory.
