this post was submitted on 24 Aug 2025

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

(page 3) 50 comments
[–] yellowcake@awful.systems 16 points 2 days ago (1 children)

I bump into a lot of peers/colleagues who are always “ya but what is intelligence” or simply cannot say no to AI. For a while I’ve tried to use the example that if these “AI coding” things are tools, why would I use a tool that’s never perfect? For example I wouldn’t reach for a 10mm wrench that wasn’t 10mm and always rounds off my bolt heads. Of course they have “it could still be useful” responses.

I’m now realizing most programmers haven’t done a manual labor task that’s important. Or lab science outside of maybe high school biology. And the complete lack of ability to put oneself in the shoes of another makes my rebuttals fall flat. To them everything is a nail and anything could be a hammer if it gets them paid to say so. Moving fast and breaking things works everywhere always.

For something not just venting I tasked a coworker with some runtime memory relocation and Gemini had this to say about ASLR: Age, Sex, Location Randomization
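(For anyone outside the field: ASLR actually stands for Address Space Layout Randomization — the OS randomizes where a process's stack, heap, and libraries land in memory on each run, precisely the thing you'd need to understand before doing runtime memory relocation. A minimal sketch, assuming a platform with ASLR enabled such as stock Linux, that makes the randomization visible by comparing a heap address across fresh runs:)

```python
# ASLR = Address Space Layout Randomization: the OS randomizes where the
# stack, heap, and shared libraries are mapped on each process launch,
# so an address observed in one run is useless in the next.
# Sketch (assumes ASLR is enabled, as on stock Linux): run the same
# one-liner several times and compare the heap address of a fresh object.
import subprocess
import sys

snippet = "import ctypes; print(hex(ctypes.addressof(ctypes.c_int(0))))"
addrs = [
    subprocess.run([sys.executable, "-c", snippet],
                   capture_output=True, text=True).stdout.strip()
    for _ in range(5)
]
print(addrs)  # with ASLR on, these typically differ between runs
```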

[–] BlueMonday1984@awful.systems 7 points 2 days ago (1 children)

I’m now realizing most programmers haven’t done a manual labor task that’s important. Or lab science outside of maybe high school biology. And the complete lack of ability to put oneself in the shoes of another makes my rebuttals fall flat. To them everything is a nail and anything could be a hammer if it gets them paid to say so. Moving fast and breaking things works everywhere always.

On a semi-related sidenote, part of me feels that the AI bubble has turned programming into a bit of a cultural punchline.

On one front, the stench of Eau de Tech Asshole that AI creates has definitely rubbed off on the field, and all the programmers who worked at OpenAI et al. have likely painted it as complicit in the bubble's harms.

On another front, the tech industry's relentless hype around AI, combined with its myriad failures (both comical and nightmarish) have cast significant doubt on the judgment of tech as a whole (which has rubbed off on programming as well) - for issues of artistic judgment specifically, the slop-nami's given people an easy way to dismiss their statements out of hand.

[–] froztbyte@awful.systems 12 points 2 days ago

sidenote

you have so many of these! it's amazing! are you going to publish soon? it seems like it might need a whole guide of its own!

moderately barbed jesting aside, a serious question: have you spoken with any programmers/artists/researchers/.... ? so many of your comments have "part of me feels" parts hitting pop-concern-direction things and, like, I get it, but. have you spoken with any of them? what were those conversations like? what did you take away from them? what stuck with you that you want to share?

[–] fnix@awful.systems 13 points 2 days ago (1 children)

More of a pet peeve than a primal scream, but I wonder what’s with Adam Tooze and his awe of AI. Tooze is a left-wing economic historian who’s generally interesting to listen to (though in tackling a very wide range of subject matter he perhaps sometimes misses some depth), but he nevertheless seems as AI-pilled as any VC. Most recently I came across this bit: Berlin Forum on Global Cooperation 2025 - Keynote Adam Tooze

Anyone who’s used AI seriously knows the LLMs are extraordinary in what they’re able to do ... 5 years down the line, this will be even more transformative.

Really, anyone Adam? Are you sure about the techbro pitch there?

[–] gerikson@awful.systems 9 points 2 days ago (1 children)

Sad if true. I really enjoyed his book The Wages of Destruction which mythbusts a lot of folk knowledge about the Nazis

https://gerikson.com/blog/books/read/The-Wages-of-Destruction.html

[–] saucerwizard@awful.systems 5 points 2 days ago

I literally just got the audiobook.

[–] gerikson@awful.systems 10 points 2 days ago (8 children)

"Enjoy" this Wronger explaining human sexual attraction

https://www.lesswrong.com/posts/ktydLowvEg8NxaG4Z/neuroscience-of-human-sexual-attraction-triggers-3

I have but skimmed it, not plumbed its depths for sneers.

[–] fullsquare@awful.systems 10 points 2 days ago (1 children)

it's just this with more words (context: someone dead serious tweeted that speedrunning is communism and brought into it peterson's mouth noises on sex somehow, and all in 14 tweets)

[–] Soyweiser@awful.systems 5 points 2 days ago* (last edited 2 days ago)

From the other reactions (don’t have the energy to read it atm, it was this or orcas), it looks like he is recreating Heartiste from first principles.

[–] sailor_sega_saturn@awful.systems 14 points 2 days ago (1 children)

In this exciting new research direction in the making-stuff-up field I build upon previous work by Myself et al. in the making-stuff-up field.

[–] sailor_sega_saturn@awful.systems 12 points 2 days ago* (last edited 2 days ago) (3 children)

Ugh reading more of this and it's awful.

He writes that women are attracted to men who could beat us up or control us. He writes that the reason for this attraction is so we have a chance to marry the man and prevent these bad things from happening.

His "science" assumes that women think like they do in shitty erotica written by men for men. Even by rationalist evo-psych standards this is pretty poorly thought out.

And yet, per Steven Pinker, “a middle-aged congresswoman does not radiate the same animal magnetism to the opposite sex that a middle-aged congressman does”. What’s the deal?

OK other straight ladies here, raise your hand if you've ever felt that middle aged congressmen, as a whole, "radiate animal magnetism". Anyone? Anyone?

[–] YourNetworkIsHaunted@awful.systems 10 points 2 days ago (1 children)

Having now read it (I have regrets), I think it's even worse than you suggested. He's not trying to argue that women are attracted to dangerous men in order to prevent the danger from happening to them. He assumes that, based on "everyday experience" of how he feels when dealing with "high-status" men, and then tries to use that as an extension of and evidence for his base-level theory of how the brain does consciousness. (I'm not going to make the obvious joke about alternative reasons why he has the same feeling around certain men that he does around women he finds attractive.) In order to get there he has to assume that culture and learning play no role in what people find attractive, which is absurd on its face and renders the whole argument not worth engaging with.

[–] zogwarg@awful.systems 7 points 2 days ago* (last edited 2 days ago)

It's almost endearing (or sad) that he believes (or very strongly wants to believe) his experience is "typical"; exploring the boundaries of what you are attracted to typically doesn't involve this much evo-psych psychobabble, or even this much fragile masculinity.

I feel like this is some friggin' Kissinger "power is an aphrodisiac" nonsense. Which is hilarious because while yes Kissinger spent more time out on the town with beautiful women than you would expect for a Ben Stein-esque war criminal, when journalists at the time talked to those women they pretty consistently said that they enjoyed feeling like he respected them and wanted to talk about the world and listened to what they had to say. But that would be anathema to Rationalism, I guess.

[–] blakestacey@awful.systems 13 points 2 days ago* (last edited 2 days ago) (2 children)

Image description: Steven Pinker, Lawrence Krauss and Jeffrey Epstein, posted as per tradition when either of the latter two is mentioned

[–] gerikson@awful.systems 6 points 2 days ago (4 children)
[–] zogwarg@awful.systems 5 points 2 days ago

Are they drawn to the cult because they are obsessed with status, or does the cult foster this obsession? Yes.

[–] istewart@awful.systems 10 points 2 days ago (4 children)

Wronger explaining human sexual attraction

Are we sure this isn't an SCP entry?

[–] BlueMonday1984@awful.systems 9 points 2 days ago
[–] blakestacey@awful.systems 11 points 2 days ago

James Gleick on "The Lie of AI":

https://around.com/the-lie-of-ai/

Nothing new for regulars here, I suspect, but it might be useful to have in one's pocket.

[–] BlueMonday1984@awful.systems 10 points 2 days ago (2 children)

New Ed Zitron: "How to Argue With An AI Booster", an hour-long read dedicated to exactly what it says on the tin.

[–] scruiser@awful.systems 7 points 2 days ago

It's a nice master post that gets all his responses and many useful articles linked in one place. It's all familiar if you've kept up with techtakes and Zitron's other posts and pivot-to-ai, but I found a few articles I had previously missed reading.

Related: the trend behind all the "but ackshually"s AI boosters like to throw out. Has everyone else noticed the pattern where someone claims a rumor they heard about an LLM making a genuine discovery in some science, except it's always repeated second-hand so you can't really evaluate it, and in the rare cases they do have a link to the source, it's always much less impressive than they made it sound at first...

[–] froztbyte@awful.systems 8 points 2 days ago (3 children)

I'm curious, do you get paid for being a multiprotocol rss repeater?

[–] YourNetworkIsHaunted@awful.systems 12 points 2 days ago (2 children)

I don't know if they do but as someone too lazy to actually set up an RSS feed I deeply appreciate it.

[–] Soyweiser@awful.systems 9 points 2 days ago

I appreciate it, and this also gives us an easy way to discuss it, as Zitron seems to be quite popular here. So it makes sense to me to just also post it here. And not everybody uses RSS (or Ed's).

[–] BlueMonday1984@awful.systems 10 points 2 days ago (1 children)

No, I do this for the love of the game

[–] scruiser@awful.systems 6 points 2 days ago

Even for the people that do get email notifications of Zitron's excellent content (like myself), I appreciate having a place here to discuss it.

[–] blakestacey@awful.systems 18 points 3 days ago (3 children)

From the comments:

Finally, I dislike the arrogant, brash, confident, tone of many posts on LessWrong.

Hmm, OK. Where might this be going?

Plausibly, I think a lot of this is inherited from Eliezer, who is used to communicating complex ideas to people less intelligent and/or rational than he is. This is not the experience of a typical poster on LessWrong, and I think it's maladaptive for people to use Eliezer's style and epistemic confidence in their own writings and thinking.

[–] dgerard@awful.systems 9 points 2 days ago (1 children)

yes, instead they use Scott's and just keep typing forever

[–] blakestacey@awful.systems 9 points 2 days ago (1 children)

"Which Scott?"

"Any of them."
