[-] MudMan@fedia.io 2 points 3 hours ago

We had enough of them at any given time that "the expats" was a relevant group of people you needed to refer to for specific things. Language lessons, HR support, what have you. I definitely heard the anglo guys refer to themselves as that frequently, and that then became the word people used.

I had a chip on my shoulder about telling people I was a migrant, but I was pretty alone on that. The anglo guys mostly said they were "expats".

[-] MudMan@fedia.io 2 points 3 hours ago

I've definitely seen it used for non-white coworkers and coworkers from other regions, but typically in the context of relocating for corporate work.

But then, I worked for a western corpo but with a ridiculously diverse group of people during that time.

[-] MudMan@fedia.io 9 points 4 hours ago

It was used colloquially, for sure... by rich corporate migrants who didn't want to self-ID as migrants. Or at least by the HR people and corpo consultants handling the international relocations and avoiding the taboo word.

Which is what the previous post is saying and it certainly matches my experience as one of the "expats". I always self-identified as a migrant myself, though.

[-] MudMan@fedia.io -2 points 5 hours ago

Screw that. I already deal with US politics and culture in enough areas of my life without also being shamed for refusing to care about their self-harming tendencies. I don't have a need to care about what the US do to themselves, in the same way I don't have a need to care about what Argentina or Hungary or Russia do to themselves. At least Russians don't have a real choice.

Admittedly, I did feel compelled to write that down here, as opposed to those other examples. In my defense, that's because a) I literally wrote it as I clicked the "block" button in this community, and b) it's insanely hard not to pay attention to the US. It requires active effort. This community isn't even called "US politics", it's just called "Politics". The US dominating my media is the default state of the world; I have to take aggressive action to make that not be the case.

[-] MudMan@fedia.io 0 points 5 hours ago

Yeah, my problem is that this was made to coincide with the Snapdragon Windows PCs, which are really good at a bunch of stuff and specifically not good at NPU performance, so the result of the "AI" branding ends up being really disappointing.

We could talk about all the other growing pains and the ways those devices were covered, but the obsessive focus on "AI" certainly didn't help, as demonstrated by the bizarre reporting linked in the OP.

[-] MudMan@fedia.io 1 point 5 hours ago

That is a weird proposal.

It's definitely weird that everyone is panicking about data center processing costs but not about the exact same hardware powering high-end gaming devices, whose power draw has skyrocketed from 100W to 450W in a few years. But ultimately, if you want to run a model locally you can run a model locally. I'm not sure how you'd regulate that; it's just software.

Hell, I don't even think distributing the load is a terrible idea, it's just that the models you can run locally in 40 TOPS kinda suck compared to the order of magnitude more processing you get on modern GPUs.
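
To put rough numbers on that gap (illustrative figures, not benchmarks):

```python
# Back-of-the-envelope comparison using assumed, illustrative figures only.
npu_tops = 40    # the ~40 TOPS class of NPU discussed above
gpu_tops = 400   # assumed ballpark tensor throughput for a modern high-end GPU

print(f"GPU over NPU: ~{gpu_tops / npu_tops:.0f}x")  # roughly an order of magnitude
```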

[-] MudMan@fedia.io 0 points 5 hours ago

That's fair, I hadn't considered the scenario of a bunch of old GOG-supported games needing updates.

I mean, in my defense that's because a lot of the older catalogue is just running under DOSBox, but there's definitely more finicky stuff in there as well.

[-] MudMan@fedia.io 4 points 8 hours ago

The idea is having tensor acceleration built into SoCs for portable devices so they can run models locally on laptops, tablets and phones.

Because, you know, server-side ML model calculations are expensive, so offloading compute to the client makes them cheaper.
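
For concreteness, "offloading to the client" mostly means pointing an inference runtime at the NPU instead of a remote API. Here's a minimal sketch using ONNX Runtime's Qualcomm QNN backend; the model file and the zeroed input are placeholders, not anything these chips actually ship with:

```python
# Minimal sketch: run an ONNX model on a Snapdragon NPU through ONNX Runtime's
# QNN execution provider, falling back to the CPU if the NPU backend is missing.
# "model.onnx" and the dummy float32 input are placeholders for illustration.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=[
        ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),  # NPU (HTP) backend
        "CPUExecutionProvider",                                    # fallback
    ],
)

inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]  # fill dynamic dims with 1
outputs = session.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})
print("ran on:", session.get_providers()[0], "- output shape:", outputs[0].shape)
```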

But this gen can't really run anything useful locally so far, as far as I can tell. Most of the demos during the ramp-up to these were thoroughly underwhelming and nowhere near what you get from server-side services.

Of course they could have just called the "NPU" a new GPU feature and made it work closer to how this is run on dedicated GPUs, but I suppose somebody thought that branding this as a separate device was more marketable.

[-] MudMan@fedia.io 9 points 8 hours ago

Well, no shit.

I've been phasing out US channels from my social media and I think it's time to block Lemmy politics and other US-focused politics discussion from here as well. I don't have much compassion for what Americans will endure the next however many years, but man, it does suck for everybody else.

[-] MudMan@fedia.io 11 points 8 hours ago

Yeah, I've been confused about this. They are basically branding the games they don't own but are supporting out of pocket, if I understand correctly.

So no, they don't own Resident Evil 1, 2 and 3, but they did the work to make them run on modern PCs, so they are now flagging them as part of their preservation program. I don't think it goes beyond that, but it's useful to have a flag for them, I suppose. It may make it easier to sell the idea to publishers or whatever.

[-] MudMan@fedia.io 7 points 8 hours ago

The stupid difference is supposed to be that they have some tensor math accelerators like the ones that have been on GPUs for three generations now. Except they're small and slow and can barely run anything locally, so if you care about "AI" you're probably using a dedicated GPU instead of an "NPU".

And because local AI features have been largely useless, there is so far no software that will, say, take advantage of NPU processing for stuff like image upscaling while using the GPU's tensor hardware for in-game raytracing or whatever. You're not even offloading any workload to the NPU when you're using your GPU, regardless of what you're using it for.
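
If something ever did split the work that way, it would presumably look like per-task device selection; here's a purely hypothetical sketch (again with ONNX Runtime, and a placeholder "upscaler.onnx"), since no shipping software does this today:

```python
# Hypothetical dispatch: send an upscaling model to the NPU if a QNN backend is
# present, else to the GPU via DirectML, else the CPU. A sketch of the split
# described above, not something any current game or app actually does.
import onnxruntime as ort

available = ort.get_available_providers()

if "QNNExecutionProvider" in available:
    providers = [("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"})]  # NPU
elif "DmlExecutionProvider" in available:
    providers = ["DmlExecutionProvider"]                                    # GPU (DirectML)
else:
    providers = ["CPUExecutionProvider"]

upscaler = ort.InferenceSession("upscaler.onnx", providers=providers)
print("upscaler running on:", upscaler.get_providers()[0])
```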

For Apple stuff where it's all integrated it's probably closer to what you describe, just using the integrated GPU acceleration. I think there are some specific optimizations for the kind of tensor math used in AI as opposed to graphics, but it's mostly the same thing.

[-] MudMan@fedia.io 16 points 8 hours ago

This is such a hilarious bit of branding nonsense. There is no such thing as "AI PCs".

I mean, I technically own one, in that the branding says I do and it has a Copilot button, but... well, that's definitely not why I purchased it and I don't think I've used an "AI feature" on it. I'm not even opinionated against them, I have run local LLMs on my other computers; it's just not a good application for the device I own that is specifically branded for "AI".

The stupidity of it is that my "AI device" is an ARM device, and I absolutely love the things ARM Windows does that are actually useful. I pulled up my old x64 device that I used before I got this and man, the speed of Windows Hello, how much better it handles video streams, the efficiency... I'd never go back for a portable device at this point.

But the marketing says it's "AI", so once people start telling each other that "AI PCs" are bad and new AMD and Intel "AI" CPUs are released, it's anybody's guess how the actually useful newer Windows ARM devices will fare.

I'm still hoping that the somewhat irrational anger towards "AI" stuff subsides so we can start talking about real features now, because man, this has been a frustrating generation to parse for portable Windows devices, and we still have Android, iOS and Mac devices coming down the pipe with similar branding nonsense.
