This is conceptually different: it just generates a few seconds of doomlike video that you can slightly influence by sending inputs, and pretends that In The Future™ entire games could be generated from scratch and playable on Sufficiently Advanced™ autocomplete machines.
The whole article is sneertastic. Nothing to add, will be sharing.
What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.
eugenic TESCREAL screed (an acronym for … oh, never mind).
“Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”
Others in San Francisco are calling it “The Nerd Reich.”
“I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”
There's also the Julian Assange connection, so we can probably blame him for Trump being president as well.
I like how you lose faith in your argument the longer your post goes on. Maybe start with the last sentence next time.
To have a dead simple UI where you, a person with no technical expertise, can ask in plain language for the data you want, presented the way you want, along with some basic analysis that you can tell it to dress up to sound important. Then you tell it to turn it into an email in the style of your previous emails, send it, and take a 50min coffee break. All this allegedly with no overhead besides paying a subscription and telling your IT people to point the thing to the thing.
I mean, it would be quite something if transformers could do all that, instead of raising global temperatures to synthesize convincing-looking but highly suspect messaging at best while being prone to delirium at worst.
"Manifest is open minded about eugenics and securing the existence of our people and a future for high IQ children."
Promptfondler proudly messes with oss project (OpenAI subreddit)
To be clear, nothing in the post makes me think they actually did what they're claiming, from the non-specific 'fixes', to never explicitly saying what the project is other than that it's 'major' and 'used by many', to the explicit '{next product iteration} is gonna be so incredible you guys' tone of the post. It's just the thought of random LLM enthusiasts deciding en masse to play programmer on existing oss projects that makes my hair stand on end.
Here they are explaining their process:
It's code reading and copy pasta.
given the traffic patterns of our threads
Highlighting the new posts since the last time you visited a thread would be amazing if possible.
Yeah, a lot of these TESCREAL exposés seem to lean on the perceived quirkiness while completely failing to convey how deeply unserious their purported scientific and philosophical footing is, like virgin tzatziki with impossible gyros unserious.
Absolutely, you can't keep pandering to the so-called anti-woke and not end up with a lot of incel-adjacent people in your spaces, and the eugenics undercurrent feeds directly into manosphere perceptions about optimizing dating and tying your self-worth to your splachnocranium/neurocranium ratio.
More specifically Scott Alexander has pandered pretty aggressively to the Dogged Good Guy demographic, and is also on the hook for being all about the 'merits' of neoreaction, and people like Moldbug and Emil Kirkegaard are semi-regulars in his comment sections.
Also worth noting that before the infamous EY editorial in TIME that called for airstrikes against foreign datacenters to prevent clippy from going rogue, the previous time they covered ea/rat was to report that they appear to have a serious sexual exploitation problem.
On a more speculative note, some staples of the movement like effective polyamory may have come about directly from early rationalist inability to get any on the regular. Apparently if you go reddit spelunking it appears they also went through a phase of trying to ~~brainwash each other~~ optimize into bisexuality to stave off sexual frustration.
To the extent EA/rats perpetuate cult behavior, it's probably safe to say that neither EY nor any other high status individuals within the space are wanting for sex.
The job site decided to recommend me an article calling for the removal of most human oversight from military AI on grounds of inefficiency, which is a pressing issue since apparently we're already living in the Culture.
The Strategic Liability of Human Oversight in AI-Driven Military Operations
~~Oh unknowable genie of the sketchily curated datasets~~ Claude, come up with an optimal ratio of civilian to enemy combatant deaths that will allow us to bomb that building with the giant red cross that you labeled an enemy stronghold.