This is why I don't trust LLMs for programming advice. I suck at programming, and tools like ChatGPT would be great if they could actually translate what I want into something I could just plug into my existing code and run with. Instead, I get answers that reference the API of an entirely different programming language, make up functions that don't exist, or just don't do what I described, if they work at all.
Maybe some of my problems with AI are just a "skill issue" and I need to figure out how to phrase shit correctly, just like how you had to know exactly how to tickle search engines back in the day: not asking a question verbatim, but plugging in keywords so they gave you what you actually wanted instead of some nonsense they thought you wanted. We called it "Google-Fu", but it has become less useful now that SEO dominates the results.
Also, I feel like LLMs are just creatively bankrupt. Case in point: I have a friend who is leaning on AI tools to help craft his next homebrew D&D campaign, and I thought that was a great use of the technology, so I tried it out as well. Well... it ended up generating a lot of the same narrative he got, right down to re-using the same proper nouns for places and people. Everything was just generic, boring fantasy; even when you fed it your own ideas, it spit back regurgitated fantasy tropes and stuff that sounds like it could have come out of a setting guide somewhere (and probably did, if that was in its training data).