this post was submitted on 07 Sep 2025
Technology

[–] Decq@lemmy.world 76 points 6 days ago (3 children)

I honestly don't get why anyone would have bought an Intel in the last 3-4 years. AMD was just better on literally every metric.

[–] stealth_cookies@lemmy.ca 17 points 5 days ago (1 children)

If your use case benefited from Quicksync then Intel was a clear choice.
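For context, a Quick Sync transcode on Linux might look like the sketch below. This is a hedged example, not from the thread: it assumes an ffmpeg build with QSV support and an Intel iGPU render node at `/dev/dri/renderD128`, and the file names are placeholders.

```shell
# Hardware H.264 -> HEVC transcode via Intel Quick Sync (QSV) on Linux.
# Guarded so it skips cleanly on machines without ffmpeg, a render node,
# or the (placeholder) input file.
if command -v ffmpeg >/dev/null 2>&1 && [ -e /dev/dri/renderD128 ] && [ -f input.mp4 ]; then
    # -hwaccel qsv decodes on the iGPU; hevc_qsv encodes on it too,
    # so the CPU cores stay mostly idle during the transcode
    ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mp4 \
           -c:v hevc_qsv -global_quality 24 -c:a copy output.mp4
    qsv_status=ran
else
    echo "no QSV-capable setup (or no input.mp4) found; skipping"
    qsv_status=skipped
fi
```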

[–] PalmTreeIsBestTree@lemmy.world 5 points 5 days ago (1 children)

Older Intel CPUs are the only ones that can play 4K Blu-rays directly in a software player, rather than ripping them to a drive first. Very niche use case, but it's one I can think of.

[–] notthebees@reddthat.com 9 points 5 days ago (1 children)

They can't even do that anymore. SGX had a bunch of vulnerabilities, and as a result that feature has been disabled.

https://sgx.fail/

[–] Quatlicopatlix@feddit.org 14 points 5 days ago (1 children)

Idle power is the only thing they're good at, but for a home server a used older CPU is good enough.

[–] Decq@lemmy.world 10 points 5 days ago (1 children)

Was that even true for comparable CPUs? I feel this was only for their N100s etc.

[–] Quatlicopatlix@feddit.org 9 points 5 days ago (17 children)

Nah, all the AM4 CPUs have abysmal idle power. AM5 got a little better as far as I know, but the Infinity Fabric was a nightmare for idle power.
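If you want to sanity-check idle power claims like this yourself, the kernel's RAPL energy counters give a rough package-level number on Linux. A hedged sketch, assuming the CPU exposes `/sys/class/powercap/intel-rapl:0` (recent Intel chips do; recent AMD chips expose the same interface via the RAPL driver) and that the counter is readable without root:

```shell
# Rough average package power over a 5-second idle window, computed
# from the monotonically increasing energy_uj counter (microjoules).
rapl=/sys/class/powercap/intel-rapl:0/energy_uj
if [ -r "$rapl" ]; then
    e1=$(cat "$rapl"); sleep 5; e2=$(cat "$rapl")
    # delta microjoules / (5 s * 1e6 uJ/J) = average watts
    echo "idle package power: $(( (e2 - e1) / 5000000 )) W"
    rapl_status=measured
else
    echo "no readable RAPL interface found; skipping"
    rapl_status=skipped
fi
```

Run it with the machine otherwise quiet; background load will inflate the reading.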

[–] Decq@lemmy.world 14 points 5 days ago

Well I concede, I guess there was one metric they were better at. Doing absolutely nothing.

[–] KiwiTB@lemmy.world 43 points 6 days ago (26 children)

Looks like they didn't have adequate cooling for their CPU and killed it... then replaced it without correcting the cooling. If your CPU hits 3 digits, it's not cooled properly.

[–] sugar_in_your_tea@sh.itjust.works 46 points 6 days ago* (last edited 6 days ago) (2 children)

If your CPU hits 3 digits, then throttling isn't working properly, because it should kick in before it hits that point.

[–] frongt@lemmy.zip 27 points 5 days ago (1 children)

The article (or one of the linked ones) says the max design temperature is 105°C, so it doesn't throttle until it hits that.

Which makes me think it should be able to sustain operating at that temperature. If not, Intel fucked up by speccing them too high.

[–] sugar_in_your_tea@sh.itjust.works 13 points 5 days ago (3 children)

I'd expect it to still throttle before getting to 105C, and then adjust to maintain a temp under 105C. If it goes above 105C, it should halt.

[–] frongt@lemmy.zip 17 points 5 days ago (7 children)

Then you misunderstand the spec. That's the max operating temperature, not the thermal protection limit. It throttles at 105 so it doesn't hit the limit at 115 or whatever and shut down. I can't find a detailed spec sheet that might give an exact figure.
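On Linux you can check what trip points the kernel actually reports for your chip, rather than guessing at the spec. A hedged sketch (the zone names and which trips exist vary by platform; on Intel the `x86_pkg_temp` zone's passive/critical trips roughly correspond to the throttle and shutdown temperatures):

```shell
# List every thermal zone with its current temperature and trip points.
# Values are in millidegrees Celsius, e.g. 105000 = 105°C.
found=0
for zone in /sys/class/thermal/thermal_zone*; do
    [ -d "$zone" ] || continue
    found=1
    type=$(cat "$zone/type")
    temp=$(cat "$zone/temp" 2>/dev/null || echo "?")
    echo "$type: current $temp"
    for trip in "$zone"/trip_point_*_temp; do
        [ -f "$trip" ] || continue
        echo "  $(basename "$trip"): $(cat "$trip")"
    done
done
[ "$found" -eq 1 ] || echo "no thermal zones exposed on this machine"
```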

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 8 points 5 days ago (3 children)

Why? It’s designed to run up to 105°C.

I think it was when AMD’s 7000-series CPUs were running at 95°C and everyone freaked out that AMD came out and said the CPUs are built to handle that load 24/7, 365 days a year, for years on end.

And it’s not like this is new to Intel. Intel laptop CPUs have been doing this for a decade now.

[–] nuko147@lemmy.world 12 points 6 days ago (1 children)

That's not the case. It's certainly true for new CPUs, but for old ones too.

My father's old CPU cooler didn't make good contact; it got loose in one corner somehow, and the system would throttle (fan at 100% making noise and the PC running slow). After I fixed it on one of my visits, the CPU worked fine for years.

The system throttles or even shuts down before any thermal damage occurs (at least when temperatures rise gradually).

[–] lemming741@lemmy.world 6 points 5 days ago

Pretty much anything with a heat spreader should be impossible to accidentally kill. Bare die? May dog have mercy on your soul.

[–] Knossos@lemmy.world 36 points 6 days ago (1 children)

I built a new PC recently. All I needed to see were the benchmarks over the last 5 years. There's currently no contest.

[–] Vanilla_PuddinFudge@infosec.pub 21 points 6 days ago (4 children)

"Do you need to transcode video?

Then leave Intel the fuck alone."

Been my rule for 20 years, and it's worked well so far.

[–] muusemuuse@sh.itjust.works 23 points 5 days ago (3 children)

It’s odd: their GPUs, a market they’re young in, are doing fine, but their well-established CPU business is cratering.

Business majors suck.

[–] TheGrandNagus@lemmy.world 7 points 5 days ago (1 children)

Sure, if by "doing fine" you mean looking alright in benchmarks while having zero supply, because they don't make money selling them and thus don't want to produce them in any significant volume.

[–] KingRandomGuy@lemmy.world 11 points 5 days ago

Their GPU situation is weird. The gaming GPUs are good value, but I can't imagine Intel makes much money from them, given the relatively low volume and relatively large die size compared to competitors (the B580's die is nearly the size of a 4070's despite competing with the 4060). Plus they don't have a major foothold in the professional or compute markets.

I do hope they keep pushing in this area still, since some serious competition for NVIDIA would be great.

[–] Fizz@lemmy.nz 13 points 5 days ago (5 children)

I'd probably just warranty the CPU and assume it was a defect instead of blaming the entire company.

But yeah, AMD is the better choice for everything atm, except power-efficient x86 laptop chips.

[–] BeardedGingerWonder@feddit.uk 3 points 4 days ago* (last edited 4 days ago)

Which is fine. It's potentially part of the huge known issue with the last couple generations of Intel chips, which affected a huge swath of CPUs. Fixes have been released, but the damage has been done; that alone would make me dubious about them going forward.

The more immediate issue, though: my CPU failed, so I need to find time to take the PC apart, safely box up the CPU, figure out the Intel RMA procedure, ship it off, wait for Intel to assess the CPU, hope they accept responsibility and ship me a new one, and then find the time, once again, to take the PC apart to put the CPU back in. Twice. And I've been without a PC the entire time. And they most likely knew about the issues before the second generation of defective chips launched. And it's not even the better chip, as you mention. I'd be sufficiently pissed off to stay away.

[–] ArmchairAce1944@discuss.online 5 points 4 days ago (1 children)

The computer I bought should last me about 10 years. I spent a fuckload of money on it. The next one will have to be built with as little Google and privacy-violating shit as possible.

And I am certain AMD will make better stuff by then.

[–] fleck@lemmy.world 2 points 4 days ago (1 children)

I'm still rocking an i7 4790K, and it's >10 years old! Judging from the other comments, the Intel issue seems to be a more recent one, though. If I ever configure a new PC, I'll check out AMD for sure.

[–] modus@lemmy.world 2 points 4 days ago

I'm still using the same chip on an Asus mobo. No problems here.

[–] 3dcadmin@lemmy.relayeasy.com 12 points 5 days ago (1 children)

It was OK until he said the AMD chip consumed more power. It's an X3D chip, so that's pretty much a given; if he'd gone for a non-X3D chip, he'd have saved quite a bit of power, especially at idle. Plus he seems to use an AMD chip like an Intel chip, with little or no idea how to tweak its power usage down.
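One common knob for tweaking a Ryzen's power usage down is the energy-performance preference (EPP) hint. A hedged sketch, assuming a Linux system where the amd-pstate (or intel_pstate) driver exposes EPP via cpufreq sysfs; the write itself needs root, so it is left commented:

```shell
# Inspect the current EPP hint and the preferences the driver supports.
epp=/sys/devices/system/cpu/cpu0/cpufreq/energy_performance_preference
if [ -f "$epp" ]; then
    echo "current EPP: $(cat "$epp")"
    echo "available:   $(cat "${epp%/*}/energy_performance_available_preferences")"
    # To bias all cores toward low power (requires root):
    # echo power | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/energy_performance_preference
else
    echo "this kernel/driver does not expose EPP; skipping"
fi
epp_checked=yes
```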

[–] xthexder@l.sw0.com 5 points 5 days ago (2 children)

I've got a 9700X and it absolutely rips at only 65W

[–] DicJacobus@lemmy.world 5 points 5 days ago

I was an Intel guy most of my life: Intel on all the hand-me-downs I got from my grandfather's appliance store, Intel on my first gaming PC from 2008 to 2012, Intel on the 2012-2019 PC. It wasn't until I built my current PC in 2019 that I switched, because of the Meltdown/Spectre/etc. issues, largely just on reputation rather than actually understanding them.

Suffice it to say, I left in 2019 and have had no reason to return.

Interesting, so it's not only their recent-ish laptop CPUs (12th or 13th gen and up, IIRC) that die under normal load.

[–] zr0@lemmy.dbzer0.com 9 points 6 days ago

I knew Michael Stapelberg from other projects, but I just realized he is the author of the i3 Window Manager. Damn!

[–] RememberTheApollo_@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (2 children)

I’ve swapped back and forth between brands since I built my first computer almost 30 years ago. It was Intel forever until AMD showed up with their early Athlons, amazing CPUs for the price. Then Intel fought back with their Core 2 Quads, AMD with Thunderbirds, back to Intel with their higher i-series, up until about 2-3 years ago, and now AMD’s Ryzen offers the better performance/$ again. It’s too bad Intel seems unable to keep costs competitive and maintain quality. I’ve never had a CPU quit on me yet (knock on wood). Motherboards, RAM, PSUs, sure. I used to do a partial upgrade every 2 years or so, but the golden era of PC building is gone. The high prices of GPUs alone really killed the momentum we had from, say, ’05-’15.

[–] kieron115@startrek.website 2 points 4 days ago

I know this is sort of still doable with AliExpress kits, but I miss the days of being able to make "weird" builds. My first build was an Athlon XP-M 2500+, a mobile chip that was just a binned desktop chip: it used the same socket as desktop, had no IHS, and ran at a lower voltage thanks to the binning. An overclocker's DREAM back in like 2005.

[–] Soup@lemmy.world 2 points 4 days ago

Intel’s strategy seems to be just chugging power into the CPU and hoping for the best.

It feels kinda like there’s a race and one person’s breathing hard and sweating bullets only to have another runner breeze past them like it’s nothing.

[–] SapphironZA@sh.itjust.works 6 points 5 days ago (3 children)

Just out of interest, why did you buy Intel in the first place? I don't know of many use cases where Intel is the superior option.
