this post was submitted on 28 May 2025
36 points (72.5% liked)

Fuck AI

2872 readers

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
top 20 comments
[–] pixxelkick@lemmy.world 42 points 1 day ago (1 children)

It turns out that to plan their ill-fated expedition, the hikers heedlessly followed the advice given to them by Google Maps and the AI chatbot ChatGPT.

Okay?

Proceeds to not elaborate even remotely further on what ChatGPT told them

Oh yeah, super high quality journalism here, folks. This entire article's premise boils down to "They asked something (unknown what) of ChatGPT related to this hike, and they got something (unknown what) back, but we're gonna go ahead and mention it and write a whole article about it"

For all I know, they just asked gippity for tips on good trail mix ideas, who knows! We sure never will, because this entire article doesn't actually bother to tell us

FFS, can we please downvote this low-quality garbage pretending to be journalism? Give me facts, people

[–] SpruceBringsteen@lemmy.world 11 points 1 day ago (1 children)

Pee is stored in the balls

Do women store pee in their balls, too? Does this have anything to do with them wanting pockets?

[–] ramble81@lemm.ee 7 points 1 day ago

Would this qualify them for Darwin Awards?

[–] stoy@lemmy.zip 15 points 1 day ago (1 children)

What surprises me most about stories like these is that people seem to have almost completely lost their sense of self-preservation in the last few years.

Why do you go up a mountain in flip-flops even in the best of weather?

[–] jagermo@feddit.org 13 points 1 day ago (2 children)

People are so used to everything being safe, especially tourists from the US or the Middle East.

I saw people in flip-flops going into the Partnachklamm, a 3-hour hike through a gorge with a secured but still slippery path. Craziness.

I think the "it has to be secure otherwise they would get sued" mindset is to blame.

[–] WhiteHotaru@feddit.org 10 points 1 day ago

People have to be rescued from the Brocken regularly. It is just 1141 m high and you can get there by train or by car. Yet people totally underestimate the weather: it can be sunny at 23 °C at the bottom and there can be 20 cm of snow on top.

Something my girlfriend experienced while studying in Geneva: she wanted to do a mountain hike with another girl. The weather was fine in the city. She came prepared with hiking boots and rain gear; the other girl showed up in slippers with a light jacket. They took the cableway to the top and, surprise: rain and snow. The other girl bought a totally overpriced jacket, they made a small tour of 20 minutes, and then descended again. My girlfriend was so pissed.

[–] lath@lemmy.world 5 points 1 day ago (2 children)

Partially. There are also the "what could possibly go wrong?" and "looks easy enough, I can do that too" mindsets that tag along.

[–] jagermo@feddit.org 7 points 1 day ago (1 children)

There was one sentence in a documentary about mountain rescue. One of them said "There is this mindset that whatever you want to accomplish, you can achieve. However, the mountains do not care about what you want to accomplish."

We are currently in a cultural moment that tells you that you can do anything you set your mind to, and, to be fair, people can do incredible things. But somewhere there is a limit, and I think the open ocean and the mountains show you those limits quickly.

[–] lath@lemmy.world 7 points 1 day ago (1 children)

I love it when someone posts about living in the woods away from civilization and most comments recognize how difficult it actually is in practice. Nature is very harsh and a very clear indicator of how incompetent or helpless we can be without our tools and regular comforts.

[–] applemao@lemmy.world 3 points 1 day ago (1 children)

This is why it's nice to dream of an off grid Alaska home, then realize what a pain it would be...or maybe not.. I bet I could do it

[–] lath@lemmy.world 3 points 1 day ago

I'm sure you could do it. It's the how well and for how long that makes me waver.

[–] Gigasser@lemmy.world 6 points 1 day ago (1 children)

I remember a guy on Reddit saying he was going to climb Mount Everest. Motherfucker admitted to not being physically fit enough, but said he would get fit enough by doing some running. He also had little to no experience climbing mountains, and had certainly never climbed anything above 7,000 meters. He then said he'd put money down on a reserved, non-refundable trip to Everest in a year to motivate himself. Oh, and all the training and experience-gaining was to be done in the year before he went.

Mind you, Everest """""can""""" be easy if you hire multiple guides and plenty of Sherpas to basically drag you up there. But the guy didn't even have that. The absolute confidence some people have, when this mountain has already taken the lives of far more experienced, veteran climbers.

At least nobody is stupid enough to climb K2 with no experience. If you've climbed and summited K2, I'd respect you a lot more than if you said you'd climbed and summited Everest, by far.

[–] lath@lemmy.world 3 points 1 day ago

At least he recognized there is some difficulty. If he balked halfway and came back in one piece, then it's not money wasted.
Some people just need to feel the pain in order to regain sobriety. For others, though, it will be pointless no matter how hard you try to convince them otherwise.

[–] FriendOfDeSoto@startrek.website 7 points 1 day ago (2 children)

AI isn't putting people's lives in danger here. It's the people's ignorance that puts their lives in danger. This is the same as when car navigation apps became available and people turned and sank their cars into creeks and harbors because they trusted their navi provider's faulty map data more than their own eyes and common sense. The problem is clueless people. If you just trust all the info ChatGPT finds for you, you are the problem. We can't just outsource the blame for all idiotic actions to so-called AI.

[–] oo1 5 points 1 day ago (1 children)

I think that depends on how convincing the AI is and what words it uses - and whether it adequately presents the risks or uncertainty associated with its suggestions.

An actually intelligent (and empathetic) thing would point out risks when asked such a question, or have the humility to say "that's not my area of expertise and it is dangerous; the best way for you to get up that mountain is to first speak to a real expert". If the AI presented itself as "artificial stupidity", then I wouldn't have a problem.

When I know I'm talking outwith my expertise, I try to make it clear that I'm using my mouth at the other end.

Of course, I didn't see the actual advice that was given; maybe it was adequately qualified. But if a person said "just walk up there, go this way and that way, turn north at the big rock and follow the ridge" without asking about experience in mountain conditions or mentioning the risks, I'd call that person stupid, negligent, or worse. So I'd apply the same standard to anything presenting itself as "intelligent" - which, frankly, I find to be a bit of a red flag when a human claims it too.

I don't think GPS is at fault, insofar as it claims to be a positioning system, it doesn't try to use hyped up bullshit terms like "intelligence".

But "sat nav" is more culpable, as it claims to "navigate", which to my understanding should involve sensitivity to the terrain, conditions, and the traveller/vehicle. If it can't do that, then the sat nav is also partly at fault for overselling its capabilities - in addition to the driver, who still bears ultimate responsibility.

I'm not saying the people weren't idiots - they take the ultimate blame. But everyone has a first experience with mountains and needs to learn as well as build experience. If AI is going to pretend to be a teacher or adviser, then I think it should be sensitive to noobs the way a real teacher would be.

[–] FriendOfDeSoto@startrek.website 2 points 1 day ago (1 children)

I can agree with a lot here but I also have to admit that I fell at the first hurdle.

I think that depends how convincing and what words the AI uses

Hard disagree here. If you're using so-called AI today, the responsibility to scrutinize everything it throws at you is yours. No matter how neatly packaged or convincingly worded it is. There is a failure rate - the news is full of stories. You're setting off to climb a mountain. You cannot trust the 1s and 0s.

As for the sat nav culpability: Google gives elevation information where they have it. I would not be surprised if we found out that was the case for these dumdums. It's a bit like reading an old paper map, though. If you don't know that more saturated colors mean higher elevation, you might have set off 30 years ago to climb this 12,000 ft mountain in flip-flops as well. I don't think we should blame sat navs for the ignorance here either, unless they hide that info maliciously.

[–] oo1 3 points 1 day ago

I think you at least have to give feedback to sat nav companies for things to maybe get better - whether you call that blame or not, I dunno. Experienced navigators report map corrections back to mapping agencies too.

What I really don't like about sat navs is that they behave like a navigator, so some people use them as a substitute for one, develop trust, and never learn to build their own navigation skills.

I can see the same with AI. Not all people have that kind of critical thinking; some people do trust other people and what they say, and they trust words written in an authoritative, human way. I wish they wouldn't, but some do seem to. Plenty of times the assistive tool will have plenty of data and give a decent answer about many things, and so build up trust - especially when it communicates in a convincing, human-like manner.

You can say that's the user's fault for being too trusting, stupid, ignorant, or naive. Maybe it is; maybe it's nature / nurture / laziness. I just say it's part of the variety of the species: some people think differently, some are more skeptical, some are more trusting, and so on. Trust is a useful thing for social animals to have in many cases - it'd be a nightmare to live without it - but it's a vulnerability too.

These AI tools, much like marketers, con artists, and scammers, will end up developing and exploiting trust - by accident, by design, by malice, or just by imitation - and I'd rather they didn't. Of course, that isn't going to stop them.

I'd just like most of these assistive tools to present their uncertainty better and flag risks better. They seem to just give less info, or say less, when they're thin on data, and that can be a bit dangerous. If a tool is thin on data, it should be saying "I'm out of my comfort zone here, this is a guess, you need to take charge". Try to prompt people not to get lazy, and to do some thinking and observation of their own.

I dunno, hopefully more people will become more skeptical and develop more critical thinking skills. But I'm skeptical of that.

[–] stoy@lemmy.zip 3 points 1 day ago

We need some kind of rule of thumb for driving, similar to aviation's "aviate, navigate, communicate".

"Operate, Communicate, Navigate" perhaps?

* Operate the vehicle
* Communicate with other drivers: use your indicators, drive predictably
* Navigate to your destination

Or in a single paragraph:

Operate the vehicle: make sure you are operating it with reason and are following the rules. Once you are doing this, you communicate with the other drivers: indicate and drive predictably. Once you are doing that, you can start navigating to your destination.

[–] jagermo@feddit.org 5 points 1 day ago

For the Germans here, I recommend "In höchster Not" in the ARD Mediathek, a docuseries about mountain rescue teams. It shows how fast things can go wrong on a mountain and what it takes to get people back out.

And yes, apps for "easy tours" are partly to blame.