Oh cool, because HDR wasn't complicated enough.
Hardware
All things related to technology hardware, with a focus on computing hardware.
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic; adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of the article, please include a URL link to the original article in the body of the post.
Some other hardware communities across Lemmy:
- Augmented Reality - !augmented_reality@lemmy.world
- Gaming Laptops - !gaminglaptops@lemmy.world
- Laptops - !laptops@lemmy.world
- Linux Hardware - !linuxhardware@programming.dev
- Mechanical Keyboards - !mechanical_keyboards@programming.dev
- Microcontrollers - !microcontrollers@lemux.minnix.dev
- Monitors - !monitors@piefed.social
- Raspberry Pi - !raspberry_pi@programming.dev
- Retro Computing - !retrocomputing@lemmy.sdf.org
- Single Board Computers - !sbcs@lemux.minnix.dev
- Virtual Reality - !virtualreality@lemmy.world
Icon by "icon lauk" under CC BY 3.0
HDR10+ seems like a much better solution than subsidizing Dolby.
You know, the core of this is actually a neat idea. It'd be nice to have things like ambient light adaptation and a push for native higher framerates standardized and built into whatever editors do. TVs do this in a bajillion different ways, all suboptimal because it's not standardized.
It sucks that it's Dolby, that it's tiered, and all the other nonsense.
...But if no one else has the industry muscle to push it, well, maybe it won't be so awful?
If it’s a standard that only works on new chips, and so many people already have TVs, I don’t see this being widely adopted unless there’s backwards compatibility.
Hisense will be the first TV brand to introduce Dolby Vision 2 to its lineup. These TVs will be powered by MediaTek Pentonic 800 with "MiraVision Pro" PQ Engine, the first silicon chip to integrate Dolby Vision 2. Timing and availability will be announced at a later date.
There is nothing about backward compatibility.
My best guess is it’ll fall back to DV1 or include DV1 data in DV2 content; otherwise they’re really shooting themselves in the foot lol
I would have thought so too, but you’d think they would mention something like that in the press release.
All this "cool" tech, but implementation is a bitch...
Atmos and DV1... most content doesn't support them.
It's still a gimmick at this point.