[-] mindbleach@lemmy.world 56 points 1 year ago

This has made a lot of people very angry and been widely regarded as a bad move.

Seriously though, this is the first properly good UI for a desktop computer. Mac OS (or I guess Macintosh OS at the time) was okay, but reliant on the global menu and weird drop-downs. Windows kept everything self-contained. Even multi-window programs tended to use the "multiple document interface," i.e., windows inside windows. Tabs weren't really a thing yet.

It also crashed if you looked at it funny and had the antivirus capabilities of warm cheese. But there's damn good reasons Windows 7 was the same experience, extended, rather than replaced. It's more-or-less what I style Linux to look like. And in light of that I'm kinda pissed off any OS ever struggles to remain responsive, when this relic ran smoothly on one stick of RAM that's smaller than my CPU's cache.

[-] mindbleach@lemmy.world 166 points 1 year ago

Oh man, y'all are in for an extremely 2014 riff on the entire movie.

Am I Gay? A journey of self-discovery with Shang.

There was a version with each image separate but Imgur deleted it because Imgur is a hollow shell of itself after whoring out yet another community for money in a naked act of stage-two enshittification.

[-] mindbleach@lemmy.world 76 points 1 year ago

I don't care. Any outcome where he's walking free is a failure. I want to stop hearing about this miserable idiot.

[-] mindbleach@lemmy.world 68 points 1 year ago

Charitable assumption.

It probably broke.

[-] mindbleach@lemmy.world 86 points 1 year ago

Famously compressible white noise.

[-] mindbleach@lemmy.world 70 points 1 year ago

Suspension of disbelief doesn't mean Harry Potter gets a lightsaber. There are rules.

[-] mindbleach@lemmy.world 73 points 1 year ago

What the fuck did Elon buy, at this point?

He fired the employees.

He threw out the code.

He yanked plugs on the physical servers.

He forgot to pay for the virtual servers.

He started a rent protest over the office space.

He deleted the brand the way Malcolm X deleted his surname.

If he'd just started a Twitter competitor, with blackjack and doxxing, the only difference would be that Twitter was a bit quieter.

If he'd bought Twitter, the hellsite, and then burned it to the ground as a weird flex, the only difference would be slightly more people using Mastodon.

And in both cases nobody would know he's a complete crybaby. We'd just harbor strong suspicions.

[-] mindbleach@lemmy.world 69 points 1 year ago

"At will" is pure conservative bullshit, right down to the name being a lie that makes "fired for illegal reasons but wink wink doesn't count" harmful to even discuss.

No relation to Musk's stupidity. Just picking a nit.

[-] mindbleach@lemmy.world 80 points 1 year ago

The PS3 had a 128-bit CPU. Sort of. "Altivec" vector processing could split each 128-bit word into several values and operate on them simultaneously. So for example if you wanted to do 3D transformations using 32-bit numbers, you could do four of them at once, as easily as one. It doesn't make doing one any faster.

Vector processing is present in nearly every modern CPU, though. Intel's had it since the late 90s with MMX and SSE. Those just had to load registers 32 bits at a time before performing each single-instruction-multiple-data operation.
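
To make that concrete, here's a toy sketch in C using Intel's SSE intrinsics (the x86 cousin of Altivec). One 128-bit register holds four 32-bit floats, and one instruction multiplies all four lanes at once - no faster than a single multiply, just four for the price of one:

```c
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics: 128-bit registers treated as four 32-bit float lanes */

int main(void) {
    float a[4]   = { 1.0f,  2.0f,  3.0f,  4.0f};
    float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    __m128 va = _mm_loadu_ps(a);     /* load four floats into one 128-bit register */
    __m128 vb = _mm_loadu_ps(b);
    __m128 vc = _mm_mul_ps(va, vb);  /* one instruction, four multiplies */
    _mm_storeu_ps(out, vc);

    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```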

The benefit of increasing bit depth is that you can move that data in parallel.

The downside of increasing bit depth is that you have to move that data in parallel.

To move a 32-bit number between places in a single clock cycle, you need 32 wires between two places. And you need them between any two places that will directly move a number. Routing all those wires takes up precious space inside a microchip. Indirect movement can simplify that diagram, but then each step requires a separate clock cycle. Which is fine - this is a tradeoff every CPU has made for thirty-plus years, as "pipelining." Instead of doing a whole operation all-at-once, or holding back the program while each instruction is being cranked out over several cycles, instructions get broken down into stages according to which internal components they need. The processor becomes a chain of steps: decode instruction, fetch data, do math, write result. CPUs can often "retire" one instruction per cycle, even if instructions take many cycles from beginning to end.
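
If you want to see why retiring one instruction per cycle matters, here's a toy C sketch (the function names are mine): the first loop is a dependency chain, so every add has to wait for the previous result to come out the far end of the pipeline; the second feeds the pipeline independent work, so it can start a new add every cycle.

```c
/* Toy sketch of latency vs. throughput - not a benchmark. */

/* Each add depends on the previous sum, so the pipeline
 * stalls for the full latency of every add. */
float dependent_chain(const float *x, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += x[i];
    return sum;
}

/* Four independent accumulators: the adder can start a new add
 * each cycle because none of them waits on the others.
 * (Tail elements ignored for brevity.) */
float independent_lanes(const float *x, int n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (int i = 0; i + 3 < n; i += 4) {
        s0 += x[i];
        s1 += x[i + 1];
        s2 += x[i + 2];
        s3 += x[i + 3];
    }
    return s0 + s1 + s2 + s3;
}
```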

To move a 128-bit number between places in a single clock cycle, you need an obscene amount of space. Each lane is four times as wide and still has to go between all the same places. This is why 1990s consoles and graphics cards might advertise 256-bit interconnects between specific components, even for mundane 32-bit machines. They were speeding up one particular spot where a whole bunch of data went a very short distance between a few specific places.

Modern video cards no doubt have similar shortcuts, but that's no longer the primary way they perform ridiculous quantities of work. Mostly they wait.

CPUs are linear. CPU design has sunk eleventeen hojillion dollars into getting instructions into and out of the processor, as soon as possible. They'll pre-emptively read from slow memory into layers of progressively faster memory deeper inside the microchip. Having to fetch some random address means delaying things for agonizing microseconds with nothing to do. That focus on straight-line speed was synonymous with performance, long after clock rates hit the gigahertz barrier. There's this Computer Science 101 concept called Amdahl's Law that was taught wrong as a result of this - people insisted 'more processors won't work faster,' when what it said was, 'more processors do more work.'
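
For the record, here's what Amdahl's Law actually says, as a quick sketch: if a fraction p of a job parallelizes across n processors, the speedup of that one job is capped by the serial part.

```c
#include <stdio.h>

/* Amdahl's Law: speedup = 1 / ((1 - p) + p / n)
 * p = fraction of the job that runs in parallel
 * n = number of processors */
double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    /* A 95%-parallel job never gets past 20x, no matter how many
     * processors you add - but those processors aren't idle, they're
     * free to do more work on other jobs. */
    for (int n = 1; n <= 1024; n *= 4)
        printf("n = %4d  speedup = %6.2fx\n", n, amdahl_speedup(0.95, n));
    return 0;
}
```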

Video cards wait better. They have wide lanes where they can afford to, especially in one fat pipe to the processor, but to my knowledge they're fairly conservative on the inside. They don't have hideously-complex processors with layers of exotic cache memory. If they need something that'll take an entire millionth of a second to go fetch, they'll start that, and then do something else. When another task stalls, they'll get back to the other one, and hey look the fetch completed. 3D rendering is fast because it barely matters what order things happen in. Each pixel tends to be independent, at least within groups of a couple hundred to a couple million, for any part of a scene. So instead of one ultra-wide high-speed data-shredder, ready to handle one continuous thread of whatever the hell a program needs next, there's a bunch of mundane grinders being fed by hoppers full of largely-similar tasks. It'll all get done eventually. Adding more hardware won't do any single thing faster, but it'll distribute the workload.
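
Here's roughly what "barely matters what order things happen in" means, as a plain-C sketch with OpenMP standing in for the GPU's pile of grinders (shade() is a made-up placeholder for the per-pixel work):

```c
/* Placeholder for the real per-pixel work (a shader, in practice). */
static unsigned int shade(int x, int y) {
    return (unsigned int)((x * 31 + y * 17) & 0xFF);
}

/* Every pixel depends only on its own (x, y), never on another pixel,
 * so the iterations can run in any order, on as many workers as exist.
 * Adding hardware doesn't make any single pixel faster - it just
 * spreads the hopper of pixels across more grinders. */
void render(unsigned int *pixels, int width, int height) {
    #pragma omp parallel for
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            pixels[(long)y * width + x] = shade(x, y);
}
```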

Video cards have recently been pushing the ability to go back to 16-bit operations. It lets them do more things per second. Parallelism has finally won, and increased bit depth is mostly an obstacle to that.

So what 128-bit computing would look like is probably one core on a many-core chip. Like how Intel does mobile designs, with one fat full-featured dual-thread linear shredder, and a whole bunch of dinky little power-efficient task-grinders. Or... like a Sony console with a boring PowerPC chip glued to some wild multi-phase vector processor. A CPU that they advertised as a private supercomputer. A machine I wrote code for during a college course on machine vision. And it also plays Uncharted.

The PS3 was originally intended to ship without a GPU. That's part of its infamous launch price. They wanted a software-rendering beast, built on the Altivec unit's impressive-sounding parallelism. This would have been a great idea back when TVs were all 480p and games came out on one platform. As HDTVs and middleware engines took off... it probably would have killed the PlayStation brand. But in context, it was a goofy path toward exactly what we're doing now - video cards you can program to work however you like. They're just parallel devices pretending to act linear, rather than the other way around.

[-] mindbleach@lemmy.world 158 points 1 year ago

Hewlett-Packard is just an unhinged ad campaign for Brother.

[-] mindbleach@lemmy.world 74 points 1 year ago

State judiciary: powerless versus their congress.

Federal judiciary: unquestionable.

Get the fuck out of our government, you miserable bastard.

[-] mindbleach@lemmy.world 188 points 1 year ago

Forty-four billion dollars to not have the employees, offices, servers, code, reputation, vetting, or brand.

And his hot new name for this smoldering ruin is... X.

Absolute fucking child.

