I have never used an HDR display before, so I’m not sure how it’s supposed to look.

I have been playing Spider-Man both with and without HDR, and unless I’m staring right into the sun there is literally no difference. I have always heard people talk about HDR as something incredible, but I’m honestly disappointed.

I also played Tetris Effect: Connected, and HDR seemed to just make all the menus darker; the rest looked the same.

Have I done something wrong or is this how it is supposed to be?

  • helenslunch@feddit.nl · 1 year ago

    People make HDR out to be a big deal, but in reality it is barely even noticeable, like going from 4K to 8K or from 120Hz to 240Hz. Most people can’t even tell the difference.

    • Cralder@feddit.nu (OP) · 1 year ago

      Thanks for the input. I got confused when people said Tetris Effect looked “sooo much better” with HDR and I wasn’t seeing any difference at all.

      • Telorand@reddthat.com · 1 year ago

        HDR, from what I loosely understand, is related to the color gamut (the reds, greens, and blues) a display can produce. The sRGB coverage used on most displays today is based on the BT.709 standard. HDR uses the newer DCI-P3 standard, which covers a wider range of colors.

        But that’s why games and systems that don’t support those extra colors won’t give you any extra “oomph” on an HDR display: they’re only coded to use the capabilities of an SDR display.

        I recommend this article for further reading: https://tomshardware.com/news/what-is-hdr-monitor,36585.html

        • entropicdrift@lemmy.sdf.org · 1 year ago

          HDR actually uses the BT.2020 color gamut. Films mastered in HDR are typically graded in DCI-P3 because that’s the standard for theaters, but DCI-P3 is a smaller color gamut than BT.2020, which is what even HDR10 (the most common form of HDR, with the lowest specs) supports.
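
          If you want a rough feel for how the three gamuts compare, here’s a quick Python sketch using the published CIE 1931 xy primaries for each standard and the shoelace formula for triangle area. The xy area is only a crude proxy for how many colors a gamut holds, but the ordering comes out clearly:

          ```python
          # Rough size comparison of the BT.709, DCI-P3 and BT.2020 gamuts.
          # Each gamut is a triangle in the CIE 1931 xy chromaticity diagram.
          PRIMARIES = {
              "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
              "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
              "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
          }

          def triangle_area(points):
              """Shoelace formula for the area of a triangle given three (x, y) points."""
              (x1, y1), (x2, y2), (x3, y3) = points
              return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

          reference = triangle_area(PRIMARIES["BT.709"])
          for name, points in PRIMARIES.items():
              area = triangle_area(points)
              print(f"{name:8s} area = {area:.4f}  ({area / reference:.2f}x BT.709)")
          ```

          That prints about 1.36x for DCI-P3 and 1.89x for BT.2020 relative to BT.709. xy areas aren’t perceptually uniform (comparisons are usually done in u′v′), but the ordering is the point: BT.2020 is the big jump.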

          • Telorand@reddthat.com · 1 year ago

            The article I cited says that modern HDR hardware can’t actually reach BT.2020, though that’s the ultimate goal.

            Has that changed?

            • entropicdrift@lemmy.sdf.org · 1 year ago

              No, it can’t. Most hardware targets DCI-P3 (though some goes beyond it) because that’s what films target in the mastering process, but HDR10 and all the other HDR formats (HDR10+, Dolby Vision, etc.) use the BT.2020 spec on the software side of things.

              In other words, the software is ahead of the hardware for now.
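
              To make that concrete, here’s a rough numpy sketch, just the textbook colorimetry math rather than anything from an actual game or TV: it derives the RGB-to-XYZ matrices from the published primaries (using the D65 white point that both BT.2020 and the consumer “Display P3” variant share), takes a fully saturated BT.2020 green, and expresses it in P3 coordinates. It lands outside the panel’s 0-to-1 range, so the display has to clip or gamut-map it:

              ```python
              import numpy as np

              def rgb_to_xyz_matrix(primaries, white):
                  """RGB -> XYZ matrix from xy primaries, scaled so RGB=(1,1,1) hits the white point."""
                  def xy_to_xyz(x, y):
                      return np.array([x / y, 1.0, (1.0 - x - y) / y])
                  m = np.column_stack([xy_to_xyz(*p) for p in primaries])
                  scale = np.linalg.solve(m, xy_to_xyz(*white))
                  return m * scale

              D65 = (0.3127, 0.3290)  # white point shared by BT.2020 and Display P3
              BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
              P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

              to_xyz_2020 = rgb_to_xyz_matrix(BT2020, D65)           # BT.2020 RGB -> XYZ
              xyz_to_p3 = np.linalg.inv(rgb_to_xyz_matrix(P3, D65))  # XYZ -> P3 RGB

              # A fully saturated BT.2020 green, in linear light (no transfer function applied).
              green_2020 = np.array([0.0, 1.0, 0.0])
              green_in_p3 = xyz_to_p3 @ to_xyz_2020 @ green_2020

              print("BT.2020 green in P3 coordinates:", np.round(green_in_p3, 3))
              print("Outside the panel's gamut?", bool((green_in_p3 < 0).any() or (green_in_p3 > 1).any()))
              print("After naive clipping:", np.clip(green_in_p3, 0.0, 1.0))
              ```

              The negative component is the math saying that this green is more saturated than the panel’s own green primary, so the hardware can only show a clipped or remapped version of what the HDR signal encodes.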

    • BruceTwarzen@kbin.social · 1 year ago

      I still don’t know how my HDR works, or if it even does. I feel like every time it’s enabled it looks weird. Maybe that’s just because it’s different, or maybe I’m doing it wrong, but it’s too much of a hassle to play with the settings. On the TV it’s a bit different; there I think it looks better.

      • helenslunch@feddit.nl · 1 year ago

        Honestly, in many cases it looks worse, even the opposite of “high dynamic range”. I don’t get the hype.

        • osprior@lemmy.world · 1 year ago

          You likely haven’t seen it on a good display. On a nice display it’s quite noticeable and a big improvement. HDR400 is not really HDR, and it’s not worth running in that mode if that’s all your display supports.