submitted 1 month ago by tonytins@pawb.social to c/tech@pawb.social

You might've noticed that ChatGPT — and AI in general — isn't good at math. There's a reason, and it has to do with how modern AI is built.

Basically, they're autocorrect on steroids. Which some of us have been saying for, like, ages.

all 10 comments
[-] some_guy@lemmy.sdf.org 22 points 1 month ago

There's zero ability to compute problems. Only statistical word salad. AI is a scam.

[-] SloanTheServal@pawb.social 12 points 1 month ago* (last edited 1 month ago)

LLMs (I refuse to call them AI, as there's no intelligence to be found) are simply random word sequence generators based on a trained probability model. Of course they're going to suck at math, because they're not actually calculating anything, they're just dumping what their algorithm "thinks" is the most likely response to user input.

"The ability to speak does not make you intelligent" - Qui-Gon Jin

[-] baldingpudenda@lemmy.world 12 points 1 month ago

Why would you expect someone working on a linguistics degree to be able to do high-level math? It's not their specialty.

[-] tonytins@pawb.social 20 points 1 month ago

To be fair, even someone with a linguistics degree knows basic math. GPT can't even get that right. That's the biggest problem (and red flag).

[-] CliveRosfield@lemmy.world 0 points 1 month ago

GPT can’t get basic math right? That hasn’t been my experience whatsoever.

[-] Zexks@lemmy.world -1 points 1 month ago

It can absolutely do basic math. Give us a ‘basic’ math question that it can’t solve.

[-] LapGoat@pawb.social 0 points 1 month ago

I've heard the ChatGPT math problem was fixed in the newer version by having it write Python code to solve the math problem and then provide the answer once the code is run.
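
That matches the general "tool use" approach: instead of guessing digits token by token, the model emits a small program and the host runs it. Below is a rough sketch of the idea, not OpenAI's actual pipeline; generate_code() is a hypothetical stand-in for whatever the model would produce.

```python
def generate_code(question: str) -> str:
    """Hypothetical stand-in for the LLM: returns Python source as a string.
    Hard-coded here so the example stays self-contained."""
    return "result = 1234 * 5678"

def answer_math_question(question: str) -> int:
    code = generate_code(question)
    namespace = {}
    # Execute the generated code (a real system would sandbox this step),
    # then report the value Python computed rather than a guessed token.
    exec(code, namespace)
    return namespace["result"]

print(answer_math_question("What is 1234 * 5678?"))  # 7006652
```

The arithmetic is then done by the Python interpreter, which is exact, so the remaining failure mode is the model writing the wrong program rather than mispredicting digits.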
