this post was submitted on 02 Sep 2025
15 points (100.0% liked)

Hardware

3664 readers
109 users here now

All things related to technology hardware, with a focus on computing hardware.


Rules (Click to Expand):

  1. Follow the Lemmy.world Rules - https://mastodon.world/about

  2. Be kind. No bullying, harassment, racism, sexism etc. against other users.

  3. No Spam, illegal content, or NSFW content.

  4. Please stay on topic, adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.

  5. Please try and post original sources when possible (as opposed to summaries).

  6. If posting an archived version of the article, please include a URL link to the original article in the body of the post.


Some other hardware communities across Lemmy:

Icon by "icon lauk" under CC BY 3.0

founded 2 years ago
MODERATORS
top 8 comments
[–] newthrowaway20@lemmy.world 10 points 10 hours ago

Oh cool, because HDR wasn't complicated enough.

[–] Alphane_Moon@lemmy.world 8 points 10 hours ago

HDR10+ seems like a much better solution than subsidizing Dolby.

[–] brucethemoose@lemmy.world 5 points 9 hours ago* (last edited 9 hours ago)

You know, the core of this is actually a neat idea. It'd be nice to have things like ambient light adaptation and a push for native higher framerates standardized and built into whatever editors do. TVs do this in a bajillion different ways, all suboptimal because it's not standardized.
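To make the ambient light adaptation idea concrete, here's a toy sketch of the kind of adjustment a TV might apply: raising the target brightness as the room gets brighter. The function name, constants, and log-ratio curve are all illustrative assumptions, not anything from Dolby's actual algorithm or any HDR spec.

```python
import math

def adapt_brightness(content_nits, ambient_lux,
                     reference_lux=5.0, strength=0.2):
    """Toy ambient-light adaptation: boost the target level as the room
    gets brighter, using a log ratio against a dim reference room.
    All names and constants are hypothetical, not from any standard."""
    if ambient_lux <= 0:
        return content_nits
    boost = 1.0 + strength * math.log10(max(ambient_lux / reference_lux, 1.0))
    return content_nits * boost

# In a dim reference room, the content level passes through unchanged.
print(adapt_brightness(100, 5.0))    # 100.0
# In a bright living room, the target level is raised.
print(adapt_brightness(100, 500.0))  # 140.0
```

The point of standardizing something like this would be that the curve and its parameters are mastered once, instead of every TV vendor inventing its own version.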

It sucks that it's Dolby. That it's tiered, and all the other nonsense.

...But if no one else has the industry muscle to push it, well, maybe it won't be so awful?

[–] JohnWorks@sh.itjust.works 4 points 10 hours ago (1 children)

If it’s a standard that’ll only work on new chips, and so many people already have TVs, I don’t see this being widely adopted unless there’s backwards compatibility.

[–] Alphane_Moon@lemmy.world 5 points 10 hours ago (1 children)

Hisense will be the first TV brand to introduce Dolby Vision 2 to its lineup. These TVs will be powered by MediaTek Pentonic 800 with "MiraVision Pro" PQ Engine, the first silicon chip to integrate Dolby Vision 2. Timing and availability will be announced at a later date.

There is nothing about backward compatibility.

[–] JohnWorks@sh.itjust.works 4 points 10 hours ago (1 children)

My best guess is it’ll fall back to, or include, DV1 data in DV2 content, otherwise they are really shooting themselves in the foot lol
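The fallback guess above amounts to capability negotiation: the player picks the newest format the display supports from whatever layers the content carries. This is a hypothetical sketch of that logic; the format names and the idea that DV2 content would carry older layers are assumptions, not confirmed by the press release.

```python
def pick_stream_layer(display_caps, content_layers):
    """Hypothetical fallback selection: prefer the newest format both
    the display and the content support. Format names are illustrative;
    no real Dolby Vision bitstream details are modeled here."""
    # Ordered newest to oldest; the first mutual match wins.
    for layer in ("DV2", "DV1", "HDR10", "SDR"):
        if layer in display_caps and layer in content_layers:
            return layer
    return "SDR"  # last-resort base layer

# An existing DV1 TV playing DV2 content that also carries a DV1 layer.
print(pick_stream_layer({"DV1", "HDR10"}, {"DV2", "DV1", "HDR10"}))  # DV1
```

If DV2 content shipped without any older layer, this selection would bottom out at SDR on existing sets, which is the foot-shooting scenario the comment describes.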

[–] Alphane_Moon@lemmy.world 3 points 9 hours ago* (last edited 9 hours ago)

I would have thought so too, but one would think they would mention something like this in the press release.

[–] sunzu2@thebrainbin.org 3 points 9 hours ago

All this "cool" tech but implementation is a bitch...

Atmos and DV1... most content doesn't support them.

It is still a gimmick at this point