this post was submitted on 14 Sep 2025
12 points (92.9% liked)

TechTakes

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

top 41 comments

Sneer inspired by a thread on the preferred Tumblr aggregator subreddit.

Rationalists found out that human behavior didn't match their ideological model, then rather than abandon their model or change their ideology decided to replace humanity with AIs designed to behave the way they think humans should, just as soon as they can figure out a way to do that without them destroying all life in the universe.

[–] blakestacey@awful.systems 9 points 18 hours ago (2 children)

Regarding occasional sneer target Lawrence Krauss and his co-conspirators:

Months of waiting but my review copy of The War on Science has arrived.

I read Krauss’ introduction. What the fuck happened to this man? He comes off as incapable of basic research, argument, basic scholarship. [...] Um... I think I found the bibliography: it's a pdf on Krauss' website? And all the essays use different citation formats?

Most of the essays don't include any citations in the text but some have accompanying bibliographies?

I think I'm going insane here.

What the fuck?

https://bsky.app/profile/nateo.bsky.social/post/3lyuzaaj76s2o

[–] V0ldek@awful.systems 5 points 4 hours ago

All of those people, Krauss, Dawkins, Harris (okay, that one might've been unsalvageable from the start, I'm really not sure), are such a great reminder that you can be as smart/educated as you want; the moment you believe you're the smartest boi and stop learning and critically approaching your own output, you get sucked into the black hole of your asshole, never to return.

Like if I had a nickel. It's hubris every time. All of those people need just a single good friend that, from time to time, would tell them "man, what you said was really fucking stupid just now" and they'd be saved.

Clout is a proxy of power and power just absolutely rots your fucking brain. Every time a Guy emerges, becomes popular, clearly thinks "haha, but I am different, power will not rot MY brain", five years later boom, he's drinking with Jordan Benzo Peterson. Even Joe Fucking Rogan used to be significantly more lucid before someone gave him ten bazillion dollars for a podcast and he suffered severe clout poisoning.

[–] nightsky@awful.systems 11 points 15 hours ago (1 children)

Huh, I wonder who this Krauss guy is, haven't heard of him.

*open wikipedia*

*entire subsection titled "Allegations of sexual misconduct"*

*close wikipedia*

[–] blakestacey@awful.systems 8 points 14 hours ago (1 children)

Image description: Screenshot of Lawrence Krauss's Wikipedia article, showing a section called "Controversies" with the subheading "Relationship with Jeffrey Epstein" followed by "Allegations of sexual misconduct". Text at https://en.wikipedia.org/wiki/Lawrence_Krauss#Controversies

[–] nightsky@awful.systems 2 points 3 hours ago

Always so many coincidences.

[–] swlabr@awful.systems 9 points 20 hours ago (2 children)

OK. So, this next thing is pretty much completely out of the sneerosphere, but it pattern matches to what we’re used to looking at: a self-styled “science communicator” mansplaining a topic they only have a reductive understanding of: Hank Green gets called out for bad knitting video

archive

[–] V0ldek@awful.systems 6 points 4 hours ago

TIL Hank Green, the milquetoast BlueSky poster, also has some YouTube channel. How quaint.

I think every time I learn That Guy From BlueSky also has some other gig different from posting silly memes I lose some respect for them.

E.g. I thought Mark Cuban was just a dumb libertarian shitposter, but then it turned out he has a cuntillion dollars and also participated in a show unironically called "Shark Tank" that I still don't 100% believe was a real thing because by god

[–] V0ldek@awful.systems 3 points 4 hours ago (1 children)

What's up with all the websites that tell me "you've reached the limit of free articles for the month" even though I've literally never entered that site before in my life. Stop gaslighting me you cunts.

Anyway, here's the archive

[–] megaman@discuss.tchncs.de 1 points 34 minutes ago

The limit is zero, that's all.

I disabled JavaScript and the popup went away.
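(A minimal sketch of why that works: these soft paywalls are usually an overlay injected client-side, so the article text is already sitting in the HTML; stripping the `<script>` elements from a saved page is often enough. The page content below is made up for illustration.)

```python
from html.parser import HTMLParser

class ScriptStripper(HTMLParser):
    """Pass HTML through unchanged, except drop <script> elements entirely."""

    def __init__(self):
        super().__init__()
        self.out = []
        self.in_script = False

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True          # start skipping
        elif not self.in_script:
            self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False         # stop skipping
        elif not self.in_script:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.in_script:
            self.out.append(data)

def strip_scripts(html: str) -> str:
    p = ScriptStripper()
    p.feed(html)
    return "".join(p.out)

# Hypothetical example page: the article is in the markup, the paywall is script-injected.
page = '<html><body><p>Article text.</p><script>showPaywall()</script></body></html>'
print(strip_scripts(page))  # → <html><body><p>Article text.</p></body></html>
```

(Real pages that load the article body itself via JavaScript won't yield to this, obviously, which is the same reason browsing with JS off sometimes gets a blank page instead of a free article.)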

[–] jaschop@awful.systems 4 points 19 hours ago (2 children)

Does anyone have a good definition or classic examples for the term mall ninja at the ready?

I first heard that term on this channel, and I feel like I should understand that phenomenon better.

[–] sleepundertheleaves@infosec.pub 10 points 14 hours ago* (last edited 14 hours ago) (2 children)

This is (a copy of) the original Mall Ninja thread from decades ago, featuring "the Sergeant of a three-man Rapid Tactical Force at one of America’s largest indoor retail shopping areas".

https://lonelymachines.org/mall-ninjas/

Fair warning: you will leave the site both enlightened and nearly dead of cringe.

[–] V0ldek@awful.systems 4 points 4 hours ago

Median age is usually 19-25

Good news, now they're 40-50 and severely divorced!

Thank you for sharing this bit of internet deep lore. Now I just need to find the four hour youtube video of some ex-GI gun nut explaining in exhausting detail exactly how bullshit every detail of those stories is because whatever the fuck is going on there is fascinating.

[–] froztbyte@awful.systems 6 points 19 hours ago (2 children)

clueless and enthusiastic (often overly so), getting real into something but often at the lower rungs

aiui the term started its life as a description of people who’d get real into weapons, but only at the grade you can buy in mall mass retail. never dug into the history tho

[–] Architeuthis@awful.systems 6 points 15 hours ago

I feel that strip mall dojos where you were ostensibly taught some very mainstream belt-based martial art like karate or TKD (or straight up make-believe stuff like ninjutsu) but were essentially glorified daycare should figure somewhere in the history of the term.

[–] Soyweiser@awful.systems 7 points 19 hours ago* (last edited 1 hour ago)

It also comes from a mall cop (a very USA sort of concept) who was extremely afraid of getting shot at his job (more so than regular cops at the time) and who overreacted massively and wanted all kinds of weird gun attachments iirc. Sadly this paranoia is something that the US cops also suffer from now. Causing everybody to suffer.

E: wow I had misremembered how crazy the story was.

[–] rook@awful.systems 10 points 1 day ago (4 children)

Woke up to some hashtag spam this morning

AI’s Biggest Security Threat May Be Quantum Decryption

which appears to be one of those evolutionary “transitional forms” between grifts.

The sad thing is the underlying point is almost sound (hoarding data puts you at risk of data breaches, and leaking sensitive data might be Very Bad Indeed) but it is wrapped up in so much overhyped nonsense it is barely visible. Naturally, the best and most obvious fix — don’t hoard all that shit in the first place — wasn’t suggested.

(it also appears to be a month-old story, but I guess there’s no reason for mastodon hashtag spammers to be current 🫤)

[–] nightsky@awful.systems 6 points 22 hours ago (2 children)

Is there already a word for "an industry which has removed itself from reality and will collapse when the public's suspension of disbelief fades away"?

Calling this just "a bubble" doesn't cut it anymore, they're just peddling sci-fi ideas now. (Metaverse was a bubble, and it was stupid as hell, but at least those headsets and the legless avatars existed.)

I would actually contend that crypto and the metaverse both qualify as early precursors to the modern AI post-economic bubble. In both cases you had a (heavily politicized) story about technology attract investment money well in excess of anyone actually wanting the product. But crypto ran into a problem where the available products were fundamentally well-understood forms of financial fraud, significantly increasing the risk because of the inherent instability of that (even absent regulatory pressure the bezzle eventually runs out and everyone realizes that all the money in those 'returns' never existed). And the VR technology was embarrassingly unable to match the story that the pushers were trying to tell, to the point where the next question, whether anyone actually wanted this, never came up.

GenAI is somewhat unique in that the LLMs can do something impressive in mimicking the form of actual language or photography or whatever it was trained on. And on top of that, you can get impressively close to doing a lot of useful things with that, but not close enough to trust it. That's the part that limits genAI to being a neat party trick, generating bulk spam text that nobody was going to read anyways, and little more.

The economics don't work out when you need to hire someone skilled enough to do the work to take just as much time double-checking the untrustworthy robot output, and once new investment capital stops subsidizing their operating costs I expect this to become obvious, though with a lot of human suffering in the debris.

The challenge of "is this useful enough to justify paying its costs" is the actual stumbling block here. Older bubbles were either blatantly absurd (tulips, crypto) or overinvestment as people tried to get their slice of a pie that anyone with eyes could see was going to be huge (railroad, dotcom). The combination of purely synthetic demand with an actual product is something I can't think of other examples of, at this scale.

[–] BlueMonday1984@awful.systems 2 points 20 hours ago

Is there already a word for “an industry which has removed itself from reality and will collapse when the public’s suspension of disbelief fades away”?

If there is, I haven't heard of it. To try and preemptively coin one, "artificial industry" ("AI" for short) would be pretty fitting - far as I can tell, no industry has unmoored itself from reality like this until the tech industry pulled it off via the AI bubble.

Calling this just “a bubble” doesn’t cut it anymore, they’re just peddling sci-fi ideas now. (Metaverse was a bubble, and it was stupid as hell, but at least those headsets and the legless avatars existed.)

I genuinely forgot the metaverse existed until I read this.

[–] froztbyte@awful.systems 4 points 23 hours ago

linkedin thotleedir posts directly into your mailbox? gonna have to pour one out for you

AI’s Biggest Security Threat May Be Quantum Decryption

an absolutely wild grab-bag of words. the more you know about each piece, the more surreal the sentence becomes. unintentional art!

[–] swlabr@awful.systems 6 points 1 day ago (1 children)

It’s a financial security threat, you see

[–] froztbyte@awful.systems 5 points 23 hours ago

that's why you should keep your at-risk data on quantum ai blockchain!!~

[–] BlueMonday1984@awful.systems 2 points 1 day ago

Naturally, the best and most obvious fix — don’t hoard all that shit in the first place — wasn’t suggested.

At this point, I'm gonna chalk the refusal to stop hoarding up to ideology more than anything else. The tech industry clearly sees data not as information to be taken sparingly, used carefully, and deleted when necessary, but as Objective Reality Units^tm^ which are theirs to steal and theirs alone.

[–] gerikson@awful.systems 6 points 23 hours ago* (last edited 20 hours ago) (3 children)

"Enjoy" this Rat fundamentally misunderstanding Banks:

https://www.boristhebrave.com/2025/09/14/the-culture-novels-as-a-dystopia/

JFC the comments on LW are even worse…

https://www.lesswrong.com/posts/uGZBBzuxf7CX33QeC/the-culture-novels-as-a-dystopia#comments

While the Culture is, on pretty much any axis, strictly superior to modern civilization, what personally appalls me is their sheer deathism.

If memory serves, the average human lives for around 500 years before opting for euthanasia, mostly citing some kind of ennui. What the hell? 500 years is nothing in the grand scheme of things.

"Why didn't Iain take my neuroses into account??"

Marvelous! This makes more sense of the culture than the books do.

"please sir may I pleasure you sexually"

[–] DonPiano@feddit.org 5 points 3 hours ago (1 children)

But one of my hobbies is “oppositional reading” – deliberately interpreting novels counter to the obvious / intended reading.

Proceeds to just make new things up and misunderstand inconsequential aspects of things that were already there

[–] DonPiano@feddit.org 2 points 3 hours ago (1 children)

I view all the books through a in-universe lens and thus will not consider that things are as they are because the narrative needs it

But then they can't imagine a large population having, largely, some traits in common due to cultural mechanisms rather than genetic engineering, and conclude that this fiction contains elements because their narrative of the world needs it. Amazing.

[–] gerikson@awful.systems 2 points 2 hours ago* (last edited 2 hours ago)

I should know the answer to this because I re-read all the Culture novels last year, but I do think there's some genetic engineering in the Culture. There are the famous sex glands, of course (but maybe the neural net handles part of that too?), and then there's the asocial dude on the remote asteroid in Excession, who I believe was seen as a genetic throwback from the general population.

But it's beside the point, Banks probably included genetic engineering to make sure no-one got horrible diseases and could live to 500 years, not to breed a separate race of elites. And for that he can never be forgiven by these idiots.

Edit: both HN and LW comments mention John C Wright, whom I have never read and vaguely remember being a Sad Puppy. He has some dreck where everything is libertarian. Banks was a socialist, but he was foremost a novelist. Faced with the need to create a future society, he naturally designed one with no disease, no material wants, and lots and lots of sex. Who wouldn't? Conservative yanks, that's who.

[–] Soyweiser@awful.systems 9 points 19 hours ago* (last edited 1 hour ago)

“Why didn’t Iain take my neuroses into account??”

Yes, why didn't he take the neuroses of normal people into account. Normal people who spend 90% of their day worrying about the acausalrobotgod killing everybody.

Strikes me that they've simply never talked to normal people about immortality: even in a post-scarcity world, a lot of people simply don't feel it would be worth it to live forever.

Edit:

But one of my hobbies is “oppositional reading” – deliberately interpreting novels counter to the obvious / intended reading. And it’s not so clear to me that the Culture is all it is cracked up to be.

This isn't oppositional reading. This is an often discussed thing in the novels. So much so that the novels have counterarguments for a lot of the regular 'the culture is bad' arguments.

Anyway the article is so bad I wonder how well this person can read.

Edit part 2, can't let things go shouting electronic booo:

Sociopaths

They mention that sociopaths either get a 24/7 drone on them to guard them if they commit crimes (more a general criminal thing), or, if they are more megalomaniacal sociopaths, they get to play out their desires in a virtual world (which I assume runs a lot like the modern game Rust, where a subset of the playerbase seems to love making 14-year-old boys cry, going from the yt vids I saw). If this isn't enough, they will need to convince a Mind to help them. Because all large machines in The Culture are intelligent. Good luck with that. Also The Culture is not something like the glitter belt of Revelation Space, where somebody can sign a contract to give away their voting rights or something. So the power of a sociopath is already limited.

not solved alignment

They both have, and it doesn't matter. They have because any Mind-equivalent mind that goes mad gets destroyed (they literally need to take care of a large group of humans or go mad; they have a symbiotic relationship with humanity), or if they try to go foom they sublime; in the Culture universe, sublimation is inevitable. (Therein also lies the real dystopian part: considering sublimation is seen as so amazing, keeping a whole culture away from proven heaven seems like an angle to take. Then the deathism also would be an argument, but more that they let people die without going to heaven (but again, this is not subtext; the Culture not going poof is seen as very weird, the only question is why a Yud-equiv mind doesn't come back to uplift the physical universe).)

A manipulated population

Not subtext, but simply text. Often criticized in various ways. But also a lot of behavior outside of the norm is tolerated, see the lava boat ride (where the only person not tolerated is the one having a 'this is a simulation' break). Or the guy just building a cable system.

Not mentioned:

consent

(This is also why humans are not pets).

[–] mawhrin@awful.systems 5 points 22 hours ago (1 children)

“let me be really brave and unique: let's imagine culture is how the azadians (or veppers, or the affront, or even the gfcf…) see them. let's ignore the basic fact that this misconception is the main reason why culture's opponents, ultimately, lose. i am very intelligent.”

[–] mawhrin@awful.systems 3 points 22 hours ago

oh, and let's entirely ignore “consider phlebas”…

[–] BlueMonday1984@awful.systems 6 points 1 day ago

New post from tante: The “Data” Narrative eats itself, using the latest Pivot to AI as a jumping off point to talk about synthetic data.

[–] BlueMonday1984@awful.systems 12 points 1 day ago (3 children)

Starting things off with a newsletter by Jared White that caught my attention: Why “Normies” Hate Programmers and the End of the Playful Hacker Trope, which directly discusses how the public perception of programmers has changed for the worse, and how best to rehabilitate it.

Adding my own two cents, the rise of gen-AI has definitely played a role here - I'm gonna quote Baldur Bjarnason directly here, since he said it better than I could:

[–] Soyweiser@awful.systems 7 points 1 day ago* (last edited 1 day ago)

Hackers is dead. (Apologies to punk)

I'd say that for one reason alone: when Musk claimed Grok was from the Guide, nobody really turned on him.

Unrelated to programmers or hackers, Elon's father (CW: racism) went fully mask-off and claims Elon agrees with him. Which, considering his promotion of the UK racists, does not feel off the mark. (And he is spreading the dumb '[Africans] have an [average] IQ of 63' shit, and claims it is all genetic. Sure man, the average African needs help understanding the business end of a hammer. As I said before, guess I met the smartest Africans in the world then, as my university had a few smart exchange students from an African country. If you look at his statements it is even dumber than normal, as he says population, so that means either non-Black Africans are not included, showing just how much he thinks of himself as the other, or they are, and the Black African average is even lower.)

[–] CinnasVerses@awful.systems 10 points 1 day ago* (last edited 1 day ago)

AFAIK the USA is the only country where programmers make very high wages compared to other college-educated people in a profession anyone can enter. It's a myth that so-called STEM majors earn much more than others, although people with a professional degree often launch their careers quicker than people without (but if you really want to launch your career quickly, learn a trade or work in an extractive industry somewhere remote). So I think for a long time programmers in the USA made peace with FAANG because they got a share of the booty.

[–] istewart@awful.systems 13 points 1 day ago (2 children)

This is an interesting crystallization that parallels a lot of thoughts I've been having, and it's particularly hopeful that it seeks to discard the "hacker" moniker and instead specifically describe the subjects as programmers. Looking back, I was only becoming terminally online circa 1997, and back then it seemed like there was an across-the-spectrum effort to reclaim the term "hacker" into a positive connotation after the federal prosecutions of the early 90s. People from aspirant-executive types like Paul Graham to dirty hippies like RMS were insistent that being a "hacker" was a good thing, maybe the best possible thing. This was, of course, a dead letter as soon as Facebook set up at "One Hacker Way" in Menlo Park, but I'd say it's definitely for the best to finally put a solid tombstone on top of that cultural impulse.

As well, my understanding of the defining activity of the positive-good "hacker" is that it's all too close to Zuckerberg's "move fast and break things," and I think Jared White would probably agree with me. Paul Graham was willing to embrace the term because he was used to the interactive development style of Lisp environments, but the mainstream tools have only fitfully evolved in that direction at best. When "hacking," the "hacker" makes a series of short, small iterations with a mostly nebulous goal in mind, and the bulk of the effort may actually be what's invested in the minimum viable product. The self-conception inherits from geek culture a slumped posture of almost permanent insufficiency, perhaps hiding a Straussian victimhood complex to justify maintaining one's own otherness.

In mentioning Jobs, the piece gestures towards the important cultural distinction that I still think is underexamined. If we're going to reclaim and rehabilitate even homeopathic amounts of Jobs' reputation, the thesis we're trying to get at is that his conception of computers as human tools is directly at odds with the AI promoters' (and, more broadly, most cloud vendors') conception of computers as separate entities. The development of generative AI is only loosely connected with the sanitized smiley-face conception of "hacking." The sheer amount of resources and time spent on training forecloses the possibility of a rapid development loop, and you're still not guaranteed viable output at the end. Your "hacks" can devolve into a complete mess, and at eye-watering expense.

I went and skimmed Graham's Hackers and Painters again to see if I could find any choice quotes along these lines, since he spends that entire essay overdosing on the virtuosity of the "hacker." And hoo boy:

Measuring what hackers are actually trying to do, designing beautiful software, would be much more difficult. You need a good sense of design to judge good design. And there is no correlation, except possibly a negative one, between people's ability to recognize good design and their confidence that they can.

You think Graham will ever realize that we're culminating a generation of his precious "hackers" who ultimately failed at all this?

[–] DonPiano@feddit.org 3 points 3 hours ago (1 children)

Interesting, I'd go rhetorically more in this direction: a hack is not a solution, it's the temporary fix (or… break?) until you get around to doing it properly. On the axis where hacks are on one end and solutions on the other, genAI shit is beyond the hack. It's not even a temporary fix; it's less, functionally and culturally.

[–] Soyweiser@awful.systems 1 points 1 hour ago* (last edited 1 hour ago)

A hack can also just be a clever way to use a system in a way it wasn't designed for.

Say you put a Ring doorbell on a drone as a perimeter-defense thing? A hack. See also the woman who makes bad robots.

It can also be a certain playfulness with tech. Which is why hacker is dead. It cannot survive contact with capitalist forces.

[–] mirrorwitch@awful.systems 7 points 1 day ago (1 children)

re: last line: no, he never will admit or concede to a single damn thing, and that's why every time I remember this article exists I have to reread dabblers & blowhards one more time purely for defensive catharsis

I don't even know the degree to which that's the fault of the old hackers, though. I think we need to acknowledge the degree to which a CS degree became a good default like an MBA before it, only instead of "business" it was pitched as a ticket to a well-paying job in "computer". I would argue that a large number of those graduates were never going to be particularly interested in the craft of programming beyond what was absolutely necessary to pull a paycheck.