• banghida@lemm.ee · 8 hours ago

    One day someone will explain to us what HDR is and why we should care.

    • ikidd@lemmy.world · 2 hours ago

      If you’re the type of person (like me) that can name at most 4 colors, then you don’t care.

    • Devorlon@lemmy.zip · 8 hours ago

      Currently most monitors use 16bits for colour (65,536 different possible colours).

      The human eye can see about 10,000,000.

      HDR / True colour is 24bits, 16,777,216 colour variations, which is more than what humans can see.

      You should care because it means images on your device will look true to life; especially as screens get brighter, materials like gold will look much nicer.

      • Phoenixz@lemmy.ca · 59 minutes ago

        Eh, no?

        You get 8 bits each for R, G and B, so 8×3 = 24 bits, not 16.

        HDR will give you better brightness and contrast levels, and a slightly wider color gamut. Visually, you WILL notice the difference.
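To sanity-check the arithmetic, here's a quick illustrative Python sketch (the 6- and 10-bit rows are my own additions for comparison, not from the thread):

```python
# Number of representable colours at a given bit depth per channel:
# three channels (R, G, B), each with 2**bits_per_channel levels.
def colour_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(colour_count(6))   # 262144 -- 6 bpc / 18 bpp
print(colour_count(8))   # 16777216 -- 8 bpc / 24 bpp ("true colour")
print(colour_count(10))  # 1073741824 -- 10 bpc / 30 bpp (common for HDR content)
```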

      • Zamundaaa@discuss.tchncs.de · 7 hours ago

        That’s not right. Most monitors use 8 bits per color / 24 bits per pixel, though some are still using 6 bpc / 18 bpp.

        HDR doesn’t mean, or really require, more than 8 bpc; it’s more complicated than that. To skip all the complicated details: it means more brightness, more contrast and better colors, and it makes a big difference for OLED displays especially.

      • accideath@lemmy.world · 6 hours ago

        That’s incorrect. While it can be assumed that HDR content supports at least 10-bit colour, it is not strictly necessary for either the monitor or the content. The main difference is contrast and brightness. SDR is mastered for a brightness of 100 nits and a fairly low contrast. HDR is mastered for brightnesses of usually 1000 or even 2000 nits, since modern displays are brighter and capable of higher contrast, and thus can produce a more lifelike picture through the additional information within HDR.

        Of course you need a sufficiently bright and/or contrasty monitor for it to make a difference. An OLED screen or displays with a lot of dimming zones would produce the best results there. But even a 350nit cheap TV can look a bit better in HDR.

    • YourMomsTrashman@lemmy.world · 8 hours ago

      I have a laptop with HDR, and back when I was still using Windows I don’t think I ever used it either. It felt like the hardware equivalent of those programs that add screenspace shaders over games lol. Maybe if I played a game or watched a movie that supports HDR I’d change my mind, but right now I am clueless. Maybe with the new GNOME

      • bassomitron@lemmy.world · 51 minutes ago

        HDR in movies/TV shows that support it looks phenomenal on a good monitor/TV, especially if you have OLED. As for games, I have yet to see any that actually look improved by it.

  • Mwa@lemm.ee · 7 hours ago

    Will Cinnamon merge this into Muffin? (Hopefully, since I know you can use Gamescope to also get HDR.)

  • Mactan · 7 hours ago

    How does that work if the Wayland color management protocol still isn’t merged?

    • IrritableOcelot@beehaw.org · 3 hours ago

      No? It’s better. HDR just means that there are more color values for each channel. On something like an OLED, that’s more important since the range between white and black is larger in terms of brightness, so to get good color resolution you need more color data.
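A toy way to see the "more color values" point: quantize a smooth gradient at different bit depths and count the distinct levels you're left with (an illustrative Python sketch of my own, not tied to any particular display):

```python
# Quantize a smooth 0..1 gradient to a given bit depth and count the
# distinct output levels. Fewer levels means visible banding, and the
# wider the display's brightness range, the bigger each step looks.
def quantize(x: float, bits: int) -> float:
    levels = 2 ** bits - 1          # highest code value at this depth
    return round(x * levels) / levels

gradient = [i / 99999 for i in range(100000)]
for bits in (6, 8, 10):
    distinct = len({quantize(x, bits) for x in gradient})
    print(bits, distinct)           # 6 -> 64, 8 -> 256, 10 -> 1024
```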

        • ColdWater@lemmy.ca · 7 hours ago

          So HDR makes the picture brighter? I thought it made colors more vibrant, which is why I thought it was useless on OLED XD

          • accideath@lemmy.world · 5 hours ago

          No, HDR can’t make your monitor brighter than it is. But it can take full advantage of the brightness and contrast of modern displays in a way SDR cannot. In almost every case HDR looks better than SDR but brighter and/or more contrasty displays take the most advantage.

          In a more technical sense, SDR content is mastered with a peak brightness of 100 nits in mind. HDR is mastered for a peak brightness of 1000 nits, sometimes 2000, and for the resulting improved contrast.

          If you don’t watch movies in HDR on a modern TV, you’re not taking full advantage of its capabilities.
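For the curious, those mastering targets live on the HDR10 transfer curve, SMPTE ST 2084 ("PQ"). Here's a rough Python sketch of its EOTF, which maps a normalized 0–1 signal value to absolute luminance in nits (the constants are the published ST 2084 values):

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384           # ~0.1593
M2 = 2523 / 4096 * 128      # ~78.84
C1 = 3424 / 4096            # ~0.8359
C2 = 2413 / 4096 * 32       # ~18.85
C3 = 2392 / 4096 * 32       # ~18.69

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(0.0)))  # 0 nits
print(round(pq_eotf(0.5)))  # ~92 nits -- half the signal range is roughly SDR peak
print(round(pq_eotf(1.0)))  # 10000 nits, the format's ceiling
```

Note how nonlinear the curve is: half the code range only gets you to about SDR brightness, leaving the rest for highlights.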

    • Vik@lemmy.world · 7 hours ago

      An optimal OLED experience is practically contingent on an HDR colour space. You wouldn’t want to limit yourself to SDR on that type of display.