[-] ebu@awful.systems 15 points 4 months ago

The point is that even if the chances of [extinction by AGI] are extremely slim

the chances are zero. i don't buy into the idea that the "probability" of some made-up cataclysmic event is worth treating as any number other than zero just because technically you can't guarantee that a unicorn won't fart AGI into existence which in turn starts converting our bodies into office equipment

It's kind of like with the Trinity nuclear test. Scientists were almost 100% confident that it wouldn't cause a chain reaction that set the entire atmosphere on fire

if you had done just a little bit of googling instead of repeating something you heard off of Oppenheimer, you would know this was basically never put forward as a serious possibility (archive link)

which is actually a fitting parallel for "AGI", now that i think about it

EDIT: Alright, well this community was a mistake...

if you're going to walk in here and diarrhea AGI Great Filter sci-fi nonsense onto the floor, don't be surprised if no one decides to take you seriously

...okay it's bad form but i had to peek at your bio

Sharing my honest beliefs, welcoming constructive debates, and embracing the potential for evolving viewpoints. Independent thinker navigating through conversations without allegiance to any particular side.

seriously do all y'all like. come out of a factory or something

this post was submitted on 24 Jun 2024
66 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community
