this post was submitted on 31 Aug 2025
12 points (100.0% liked)

TechTakes

2147 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] rook@awful.systems 1 points 16 minutes ago

One to watch from a safe distance: dafdef, an “ai browser” aimed at founders and “UCG creators”, named using the traditional amazon-keysmash naming technique and following the ai-companies-must-have-a-logo-suggestive-of-an-anus style guide.

Dafdef learns your browsing patterns and suggests what you'd do next. After watching you fill out similar forms a few times, Dafdef starts autocompleting them. Apply with your startup to YC, HF0 and A16z without wasting your time.

So… spicy autocomplete.

But that’s not all! Tired of your chatbot being unable to control everything on your iphone, due to irksome security features implemented by those control freaks at apple? There’s a way around that!

Introducing the “ai key”!

A tiny USB-C key that turns your phone into a trusted AI assistant. It sees your screen, acts on your behalf, and remembers — all while staying under your control.

I’m sure you can absolutely trust an ai browser connected to a tool that has nearly full control over your phone to not do anything bad, because prompt injection isn’t a thing, right?

(I say nearly full, because I think Apple Pay requires physical interaction with a phone button or face id, but if dafdef can automate the boring and repetitive parts of using your banking app then having full control of the phone might not matter)

h/t to ian coldwater

[–] zogwarg@awful.systems 1 points 22 minutes ago

An interesting talk on the impact of AI slop bug bounty submissions on the curl project (youtube).

[–] gerikson@awful.systems 6 points 9 hours ago
[–] dgerard@awful.systems 5 points 12 hours ago (3 children)

I should get a green screen for my videos. Suggestions welcomed.

  • I can't leave it up permanently, I need to be able to get it out and put it away each use without it being a massive PITA.
  • I record in a small room. It's tiny and awkwardly shaped and I'm fixed in place by perspective tricks (something I would like the green screen to alleviate).
  • Amazon is the least worst vendor in this circumstance if all else is equal.

Any ideas? Do any of you use such a device yourselves?

[–] swlabr@awful.systems 7 points 7 hours ago (1 children)

Mount camera to ceiling, get a green rug, record lying on the ground (jk)

[–] jonhendry@iosdev.space 4 points 7 hours ago

@swlabr

While lying on a lazy susan and rotating slowly, with Major Tom playing faintly in the background.

[–] swlabr@awful.systems 1 points 5 hours ago* (last edited 5 hours ago)

Actual suggestion: maybe mount a roller blind to the ceiling. Might need extra weight to tension the blind but also might not.

Also, you probably don’t need a totally green background, just enough to make masking easier. You could conceivably tape a big sheet of green cardboard to the back of your chair and see how that goes.

Also: idk how hard or easy this is but if the camera position is fixed and the rest of the scene is static, maybe there’s software that can just mask out the static scene?
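[Editor's note: software like that does exist; the technique is background subtraction. A toy pure-Python sketch of the idea, using hypothetical 1-D lists of grayscale pixel values — real tools such as OpenCV's `BackgroundSubtractorMOG2` maintain an adaptive per-pixel statistical model rather than a single reference frame:]

```python
def background_mask(frame, background, threshold=20):
    """Return a foreground mask: True wherever the live frame differs
    from the stored background frame by more than the threshold
    (i.e. wherever the presenter is), False for static background."""
    return [abs(p - b) > threshold for p, b in zip(frame, background)]

# Toy 1-D "frames" of grayscale pixels: capture the background once
# with the room empty, then compare each live frame against it.
background = [100, 100, 100, 100]
live_frame = [100, 180, 175, 101]  # presenter occupies the middle pixels

mask = background_mask(live_frame, background)
print(mask)  # -> [False, True, True, False]
```

[In practice you would tune the threshold for your lighting; sensor noise and shadows are exactly why real implementations use adaptive models instead of one fixed reference frame.]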

[–] BlueMonday1984@awful.systems 4 points 10 hours ago

I know Elgato do a collapsible greenscreen, but that's the only one coming to mind.

[–] mirrorwitch@awful.systems 9 points 1 day ago

So I learned about the rise of pro-Clippy sentiment in the wake of ChatGPT and that led me on a little ramble about the ELIZA effect vs. the exercise of empathy https://awful.systems/post/5495333

[–] Soyweiser@awful.systems 5 points 1 day ago* (last edited 20 hours ago) (3 children)

Not worthy of a third post imho, but Scott made the trilogy post: https://scottaaronson.blog/?p=9108 where he is subtly walking back his insane, proudly kid-killing claim to the more reasonable “Israel needs to defeat Hamas” claim.

For a man angry about his detractors being intellectually dishonest(**), this is very typical:

Incredibly, commenters on Peter Woit’s blog then blamed me for this antisemitic image, mistakenly imagining that I’d created it myself, and then used their false assumption as further proof of my mental illness

For context, it was an AI slop image, he named it after Woit, and he didn't provide any details from the email itself, while the whole affair is quite a good reason to at least put this person's name/email out there imho.

But also:

I’m even grateful, in a way, to SneerClub, and to Woit and his minions. I’m grateful to them for so dramatically confirming that I’m not delusional: some portion of the world really is out to get me. I probably overestimated their power, but not their malevolence. [...]

His lack of self-awareness and reading skills remains bad. The power thing was partially in his first blogpost already, where he realized the 'manhaters' were just a small group. And the latter makes no sense unless you treat the random cartoon guy (who nobody here or at Woit expressed support for), rather than actual sc people or fans of Peter, as the consensus.

The last part is just nuts, a whole bit of rationalization about why he is actually justified and will change the world, not realizing everyone was horrified because he created a thought experiment to justify a genocide:

Reading the SneerClubbers’ armchair diagnoses of my severe mental illness, paranoia, persecution complex, grandiosity, etc. etc. I had the following thought, paraphrasing Shaw:

Yes, they’re absolutely right that psychologically well-adjusted people generally do figure out how to adapt themselves to the reigning morality of their social environment—as indicated by the Asch conformity test, the Milgram electric-shock experiment, and the other classics of social psychology.

It takes someone psychologically troubled, in one way or another, to persist in trying to adapt the reigning morality of their social environment to themselves.

If so, however, this suggests that all the moral progress of humanity depends on psychologically troubled people—a realization for which I’m deeply grateful.

This bit opens up so many questions and remarks, and is just silly in some ways. Yes, people who are dissatisfied will push for change, but why is this a revelation? And do they have the power? Is it justified? Jared Taylor(*) also pushes for a changed morality system, but I wouldn't consider that desirable. It also leaves out that people will push for change because they just want to profit from it, which is likely a lot bigger driver of change: see the libertarians pushing for less regulation, because the dumb and distracted deserve to be scammed.

Anyway we got called out twice!

*: a white nationalist piece of shit who from what I heard is notable because, compared to his peers, he isn't a raging anti-Semite. At least not openly.

**: Edit: as nobody mentioned the whole 'Peter is intellectually dishonest' affair, I have to say one thing. Peter is imho not intellectually dishonest for leaving out that the people who tie their children to the train track could stop at any time. The people who don't mention this just don't think it is a convincing argument. Yes, people who take hostages could release the hostages at any time; still no reason to shoot through the hostages. Derail the trolley! The way Russia dealt with the Moscow theater hostage crisis was considered bad for a reason (I hope I don't have to point out that the hostage takers also were in the wrong here). Forgot who it was who mentioned it here, but the whole 'I would be glad to not flip the switch' is one of the fucked up parts; it would scar people for life to make that choice.

Another edit: While thinking I realized the whole disgusting email thing is also partially due to the asshole-filter effect of Scott closing his comments (which was smart tbh, he just should have told the people emailing him to fuck off): https://siderea.dreamwidth.org/1209794.html. Left a hopefully thoughtful comment about this on Peter's blog (I looked at it just from the comments-closing aspect and not with the further conflict in mind, as I have not begun to think about that, and don't want to think about it in context of what Scott wrote in his first blog post). Not sure if he will let it through moderation (and it is fine if he doesn't, and I doubt he will, as he seemingly just closed the comments and went on vacation; have a good one Peter, apologies that your blog became a battleground over this). But I thought the concept was important enough as an idea to share, and it also explains why you get the more shitty people reacting when you go 'please don't react here'. A thing we should also be aware of: the drive-by commenters (which were worse on reddit) tend to be the sediment of the crop.

[–] Architeuthis@awful.systems 10 points 20 hours ago* (last edited 20 hours ago)

Shamelessly reproduced from the other place:

A quick summary of his last three posts:

"Here's a thought experiment I came up with to try to justify the murder of tens of thousands of children."

"Lots of people got mad at me for my last post; have you considered that being mad at me makes me the victim and you a Nazi?"

"I'm actually winning so much right now: it's very normal that people keep worriedly speculating that I've suffered some sort of mental breakdown."

[–] Soyweiser@awful.systems 6 points 20 hours ago* (last edited 19 hours ago) (1 children)

Gonna repost a bit from a comment I left on reddit:

Read swlabrs reply first btw.

Anyway, something useful perhaps, the ~~World Central Kitchen~~ seems to help out with the famine so donating to them might be useful, see comment below. Donate to the UNRWA organisation instead. Not comfortable donating on a link provided by a random user (smart!), or want something to show for your donation, the current Humble book bundle is donating to the WCK (do not forget to adjust the sliders) and you get a nice collection of Martha Wells books (The Murderbot saga is great). Or buy the play for peace bundle which is donating to the UNRWA USA, and get a shitton of games and other stuff. (All links are affiliate free from my end, I know there is a system setup for Humble stuff, but I don't use that).

E: buying the last one is also a fun way to boost yourself into Spiders Georg levels of video game ownership. So you could do it for that joke alone. Just to claim you own ~300 videogames.

[–] swlabr@awful.systems 5 points 19 hours ago* (last edited 19 hours ago) (1 children)

I’ve followed Jose Andres on insta for about 6 years now, and while I’d love to only have nice things to say about him and WCK, I know that there is resentment/resistance to the actions of WCK, for good reason. This is the first thing that pops up, a page detailing how WCK works with the IDF and how that is not in the best interests of Palestine. What’s also really gross, ghoulish and troublesome is that, despite the fact that Israel has literally killed WCK volunteers, Andres still works with them, and is largely pro-israel. That being said, everything about this is fucked anyway and I imagine that if you’re donating, your heart is in the right place. But, uh, yeah, adjust those sliders.

E: ok I have fully read the page. Fuck Andres, fuck WCK, please do not donate to them. Donate to UNRWA instead, or any organisation that is actually willing to call what Israel is doing a genocide.

[–] Soyweiser@awful.systems 5 points 19 hours ago* (last edited 19 hours ago)

Thanks for the info, will link this comment on reddit as well. Edited both posts, and yeah I just knew of both book/game projects and was going via that, used the humble link first that is why WCK came out on top and not UNRWA.

[–] Architeuthis@awful.systems 9 points 23 hours ago (2 children)

I’m even grateful, in a way, to SneerClub, and to Woit and his minions. I’m grateful to them for so dramatically confirming that I’m not delusional: some portion of the world really is out to get me. I probably overestimated their power, but not their malevolence. […]

Honestly what he should actually be grateful for is that all his notoriety ever amounted to^[1]^ was a couple of obscure forums going 'look at this dumb asshole' and moving on.

He is an insecure and toxic serial overreactor with shit opinions and a huge unpopular-young-nerd chip on his shoulder, and who comes off as being one mildly concerted troll effort away from a psych ward at all times. And probably not even that, judging from Graham Linehan's life trajectory.

[1] besides Siskind using him to broaden his influence on incels and gamer gaters.

[–] o7___o7@awful.systems 7 points 21 hours ago* (last edited 21 hours ago)

Unmonitored RSD is a real sonuvabitch

[–] Soyweiser@awful.systems 6 points 23 hours ago

While this is close to 'look at what you made me do' territory, we would sneer a lot less at him if he didn't blame us for everything. 'People who sneer made covid worse' (not the direct quote) for example was just silly, and if you look at the reaction of sneerclub at the time, also not in the realm of reality. (But yes, he will just say he said sneer, by which he didn't mean sneerclub but people like us in general. Which is obv not a thing I fully agree with, but good motte/bailey.)

It also is interesting, as Scott compared to the others we sneer at never really seems to break containment so to speak. I have seen people talk about Aella on bsky for example, Scott Alexander, Eliezer, lesswrong, EA, etc all come up. But Scott almost never does. (Yes, the 'untitled' affair was public, but that was a decade ago, and 6 months before r/sneerclub was created (and long before I joined), and also the whole broaden-influence thing as you mentioned). And after all why should he, he is just a random professor; the only reason he is relevant for the broader picture is that he agrees with the AI doom stuff, and he gives the LW people some level of prestige (the only times I have brought him up is because he confirmed that Yarvin spends time personally emailing prestigious people like him, but that is about Moldy). He is prob the only one whose sneering is just contained to sneerclub/awful.systems (which is why he should stop reading sc; he prob should also ignore more blog comments and emails (block Yarvin's email Scott, do it!)).

[–] BlueMonday1984@awful.systems 2 points 22 hours ago (3 children)

"For those in writing-related jobs, they may find lucrative work cleaning up attempts to sidestep them with AI slop, squeezing hefty premiums from desperate clients who find themselves lacking leverage over them."

Me, 17 days ago

Well, seems I was pretty close - NBC news recently reported humans are being hired to clean up AI slop. My prediction it would be lucrative was off the mark, though - artists called in for de-slopping work are getting paid less than if they were simply hired to create the work themselves. Clearly, I was being overly optimistic.

You want my take, anyone who gets hired for slop cleanup should try to squeeze as much cash out of their clients as possible - they showed open contempt for humanity by choosing a clanker, they need to be shown the consequences.

[–] scruiser@awful.systems 10 points 9 hours ago

Chiming in to agree your prediction write-ups aren't particularly good. Sure, they spark discussion, but the whole forecasting/prediction game is one we've seen the rationalists play many times, and it is very easy to overlook or at least undercount your misses and overhype your successes.

In general... I think your predictions are too specific and too optimistic...

[–] self@awful.systems 11 points 11 hours ago (1 children)

@froztbyte@awful.systems is coming in a bit hot, but they’re not fundamentally wrong. @BlueMonday1984@awful.systems, you’ve received some pointed but good feedback on your writing style recently in MoreWrite and in other threads; I think you’ll be able to make much the same points more persuasively if you incorporate some of that feedback. I also recommend breaking out of doing predictions; I think you might get more mileage and variation out of a journalistic approach, and either way we know from crypto that prediction is a bit of a fool’s errand. these fuckers will remain irrational long after we’ve gone bankrupt, etc.

I’m not gonna force you to do anything; if posts like these are what you’re comfortable with then it is what it is. I just feel like if you break out of your current pattern, you’re capable of achieving great results.

[–] blakestacey@awful.systems 8 points 9 hours ago

I was going to chime in to say something similar. I don't think trying to game out the possible reaction to the possible hype about the possible application, etc., etc., is the best use of anyone's time. It might be more beneficial to, for example, keep track of the cases where the guys selling "quantum" are the same guys who have been selling "AI" and "crypto".

[–] froztbyte@awful.systems 2 points 22 hours ago (1 children)

Me, 17 days ago

this really isn't a hard guess

You want my take

probably not

anyone who gets hired for slop cleanup should try to squeeze as much cash out of their clients as possible

"people should try to get paid well"? that's your whole take? really? you thought this was worthwhile posting? not, maybe, spitballing ideas for how people should get paid well? some advice on how to negotiate with clients who are quite likely to be pennypinching types (evidenced by them trying to get as much as possible for free)? none of that, just more fluff? okay then

[–] deathgrindfreak@awful.systems 0 points 12 hours ago (1 children)

You're kind of a dick, maybe you're in the wrong instance?

[–] self@awful.systems 5 points 12 hours ago

who the fuck are you again? what in the fucking world made you think your 14 comments and 1 post meant you got to decide who gets to have an account here?

[–] scruiser@awful.systems 11 points 1 day ago (2 children)

Lesswronger notices all of the rationalists' attempts at making an "aligned" AI company keep failing: https://www.lesswrong.com/posts/PBd7xPAh22y66rbme/anthropic-s-leading-researchers-acted-as-moderate

Notably, the author doesn't realize Capitalism is the root problem in misaligning the incentives, and it takes a comment directly pointing it out for them to get as far as noticing a link to the cycle of enshittification.

[–] swlabr@awful.systems 19 points 1 day ago (1 children)
>50 min read  
>”why company has perverse incentives”
>no mention of capitalism

rationalism.mpeg

[–] scruiser@awful.systems 5 points 9 hours ago (1 children)

Every time I see a rationalist bring up the term "Moloch" I get a little angrier at Scott Alexander.

[–] swlabr@awful.systems 2 points 8 hours ago* (last edited 7 hours ago)

“Moloch”, huh? What are we living in, some kind of demon-haunted world?

Others were alarmed and advocated internally against scaling large language models. But these were not AGI safety researchers, but critical AI researchers, like Dr. Timnit Gebru.

Here we see rationalists approaching dangerously close to self-awareness and recognizing their whole concept of "AI safety" as marketing copy.

[–] BigMuffN69@awful.systems 8 points 1 day ago* (last edited 1 day ago) (2 children)

Great piece on previous hype waves by P. Ball

https://aeon.co/essays/no-suffering-no-death-no-limits-the-nanobots-pipe-dream

It’s sad, my “thoroughly researched” “paper” greygoo-2027 just doesn’t seem to have that viral x-factor that lands me exclusive interviews w/ the Times 🫠

[–] scruiser@awful.systems 9 points 1 day ago (2 children)

Putting this into the current context of LLMs... Given how Eliezer still repeats the "diamondoid bacteria" line in his AI-doom scenarios, even multiple decades after Drexler was thoroughly debunked (while slightly contributing to inspiring real science), I bet memes of LLM-AGI doom and utopia will last long after the LLM bubble pops.

[–] dgerard@awful.systems 4 points 12 hours ago (1 children)

did he actually contribute anything other than chemists nicking the name "nanotechnology"?

[–] scruiser@awful.systems 1 points 9 hours ago

I use the term "inspiring" loosely.

[–] Soyweiser@awful.systems 6 points 1 day ago

Eliezer came from the extropian newsgroups/mailinglists iirc. So it is quite connected.

[–] Soyweiser@awful.systems 7 points 1 day ago

Indeed great piece, good to document the older history of that stuff as well.

[–] bitofhope@awful.systems 8 points 1 day ago (1 children)

Creator of NaCl publishes something even saltier.

"Am I being detained?" I scream as IETF politely asks me to stop throwing a tantrum over the concept of having moderation policy.

[–] BasiqueEvangelist@awful.systems 5 points 1 day ago (2 children)

Does somebody have a rundown or something on DJB? All of the tantrum throwing has me confused over what his deal is.

[–] corbin@awful.systems 5 points 15 hours ago

Sibling comment is important recent stuff. Historically, the most important tantrum he's thrown is DJB v USA in 1995, where he insisted that folks in the USA have a First Amendment right to publish source code. He also threw a joint tantrum with two other cryptographers over the Dual EC DRBG scandal after Snowden revealed its existence in 2013. He's scored real wins against the USA for us, which is why his inability to be polite is often tolerated.

[–] froztbyte@awful.systems 6 points 1 day ago (1 children)

noted for advancements in cryptography, and “stayed impartial” (iirc not quite defending, but also not acknowledging nor distancing) when the jacob appelbaum shit hit wider knowledge

probably about all you need to know in a nutshell

the most recent shit before this when I recall seeing his name pop up was when he was causing slapfight around Kyber (ML-KEM) in the cryptography spaces, but I don’t have links at hand

[–] mawhrin@awful.systems 7 points 14 hours ago (1 children)

there's a bit more; tanja lange, a notable cryptographer herself and djb's partner was appelbaum's phd advisor (and didn't drop him after he got outed as a rapist).

[–] froztbyte@awful.systems 2 points 5 hours ago

heh fuck you’re right, I forgot about that

[–] fullsquare@awful.systems 7 points 1 day ago (2 children)
[–] dgerard@awful.systems 5 points 12 hours ago (1 children)

really has turned into Scientology

[–] fullsquare@awful.systems 3 points 7 hours ago

scientology but less coherent and with worse seafaring

[–] froztbyte@awful.systems 8 points 1 day ago (1 children)

ah yes, that great mark of certainty and product security, when you have to unleash pitbulls to patrol the completely not dangerous park that everyone can totally feel at ease in

(and of course I bet the damn play is a resource exhaustion attack on critics, isn’t it)

[–] YourNetworkIsHaunted@awful.systems 5 points 1 day ago (1 children)

I don't think it's a resource exhaustion attack as much as a combination of legitimate paranoia (the consequence of a worldview where only billionaires are capable of actual agency) and an attempt to impose that on reality by reverse-astroturfing any opposition by tying it to other billionaire AI bros.

[–] froztbyte@awful.systems 1 points 6 hours ago

heh you may be right, they certainly do appear to be that flavour of delusional

[–] BlueMonday1984@awful.systems 6 points 1 day ago (2 children)

New Baldur Bjarnason: The melancholy of history rhyming, comparing the AI bubble with the Icelandic banking bubble, and talking about the impending fallout of its burst.
