submitted 3 months ago by stefenauris@pawb.social to c/tech@pawb.social

That's not a good sign

[-] bitfucker@programming.dev 2 points 3 months ago

*Rant about the beginning of the article ahead*

Why in the name of god did they try to bring LLMs into the picture? Saying AI/ML is good enough for predictive maintenance tasks, but noooo, it has to be an LLM. If they want to be specific, then don't be misleading; I think what they actually mean is the attention layer/operation commonly used in LLMs, applied to capture time-series data. I understand that recurrent-style neural networks and LSTMs have their limitations, and I agree that exploring attention for time-series data is interesting research, but an LLM? Just no.
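(To illustrate the distinction being made: a minimal sketch of "attention for time series" without any LLM. This is not from the article; the model name, layer sizes, and the remaining-useful-life output are illustrative assumptions only.)

```python
# Sketch: a single self-attention block over a sensor time series,
# i.e. the attention operation itself, used directly for predictive
# maintenance rather than wrapped inside an LLM. Shapes/hyperparameters
# are made up for illustration.
import torch
import torch.nn as nn

class TimeSeriesAttention(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)   # project raw sensor readings
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)             # e.g. a remaining-useful-life score

    def forward(self, x):                             # x: (batch, time, n_features)
        h = self.embed(x)
        h, _ = self.attn(h, h, h)                     # self-attention across time steps
        return self.head(h[:, -1])                    # predict from the last time step

# Toy usage: 8 sequences of 128 time steps from 6 sensors.
model = TimeSeriesAttention(n_features=6)
scores = model(torch.randn(8, 128, 6))
print(scores.shape)  # torch.Size([8, 1])
```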

[-] drwho@beehaw.org 1 point 3 months ago

Part of an SEO strategy, maybe?

[-] bitfucker@programming.dev 2 points 3 months ago

Aye, that's a fair assumption
