• Zamundaaa@discuss.tchncs.de · 4 days ago

    I don’t actually believe this to be the case; if it were, people who use custom ICCs would get extremely wonky results, which doesn’t typically happen

    They wouldn’t, because applying ICC profiles is opt-in for each application. Games and at least many video players don’t apply ICC profiles, so they do not see negative side effects of it being handled wrong (unless they calibrate the VCGT to follow the piece-wise TF).

    With Windows Advanced Color of course, that may change.
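    Just as an illustration of what “applying an ICC profile” roughly involves for an app (a real ICC transform also has matrices or 3D LUTs; the names here are made up for the example): at minimum each channel gets run through a profile-derived 1D curve, and an app that doesn’t opt in simply skips this step entirely:

```python
# Illustration only: the simplest piece of an ICC transform, running each
# channel through a profile-derived 1D curve. Real transforms also involve
# matrices or 3D LUTs; names here are made up for the example.

def apply_curves(pixel, lut_r, lut_g, lut_b):
    """pixel is (r, g, b), each channel normalized to [0, 1]."""
    def lookup(value, lut):
        # Nearest-entry lookup; real implementations interpolate.
        index = min(int(value * (len(lut) - 1) + 0.5), len(lut) - 1)
        return lut[index]
    r, g, b = pixel
    return (lookup(r, lut_r), lookup(g, lut_g), lookup(b, lut_b))

# An application that doesn't opt in leaves its pixels untouched, which is
# why games and most video players never notice when the profile (or the
# transfer function behind it) is handled wrong.
identity = [i / 255 for i in range(256)]
print(apply_curves((0.5, 0.25, 1.0), identity, identity, identity))
```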

    I think I am a bit confused about the laptop analogy then; could you elaborate on it?

    What analogy?

    How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I couldn’t really comment on whether or not this holds true one way or another. Just so I am not misinterpreting you, are you saying that “if you feed 300 nits of PQ, the monitor will not allow it to go above its 300 nits”? If so, this is not the case on my TV unless I am in “creator/PC” mode. In other modes it will allow it to go brighter or dimmer.

    Yes, that’s exactly what happens. TVs do random nonsense to make the image look “better”, and one of those image optimizations is to boost brightness. In this case it’s far from always nonsense, of course (on my TV it was, though; it made the normal desktop waaay too bright).
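    For reference, “feeding 300 nits of PQ” is well defined because the ST 2084 inverse EOTF maps absolute luminance to a signal value, so a display that follows the spec knows exactly which luminance each code corresponds to. A rough sketch (constants are from the standard, the function name is mine):

```python
# Sketch: SMPTE ST 2084 (PQ) inverse EOTF, mapping absolute nits to a
# normalized signal value. Constants are from the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0           # PQ is defined up to 10,000 cd/m²
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

signal = pq_encode(300)                     # ~0.62
print(signal, round(signal * 1023))         # ~636 as a 10-bit code value
# A display that treats PQ as absolute shows this at 300 cd/m²;
# a TV in a "vivid"-style mode may boost it well beyond that.
```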

    unless I am in “creator/PC” mode

    Almost certainly just trying to copy what monitors do.

    With libjxl it doesn’t really default to the “SDR white == 203” reference from the “reference white == SDR white” common… choice? not sure how to word it… Anyway, libjxl defaults to “SDR white = 255” or something along those lines, I can’t quite remember. The reasoning for this was simple: that was what they were tuning butteraugli on.

    Heh, when it came to merging the Wayland protocol and we needed implementations for all the features, I was searching for a video or image standard that did exactly that. The protocol has a feature where you can specify a non-default reference luminance to handle these cases.
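    Very roughly, the idea behind a configurable reference luminance is just a rescale so that the content’s declared reference white lands on the output’s reference white. A toy sketch, with made-up numbers and function name (this is not the actual protocol API):

```python
# Sketch only: rescaling content luminance when the content declares a
# non-default reference white (e.g. 255 nits instead of the common 203).
# This mirrors the idea behind a per-surface reference luminance, not any
# specific protocol request.

def rescale_reference(nits: float, content_ref: float, output_ref: float) -> float:
    return nits * (output_ref / content_ref)

# Content mastered with "SDR white = 255 nits", shown on a desktop that
# places reference white at 203 nits:
print(rescale_reference(255, content_ref=255, output_ref=203))  # -> 203.0
print(rescale_reference(510, content_ref=255, output_ref=203))  # -> 406.0
```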

    It is indeed the case that users won’t know what transfer function content is using, but they absolutely do see a difference other than “HDR gets brighter than SDR”, and that is “it’s smoother in the dark areas”, because that is also equally true.

    That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that’s more to do with marketing conflating the two than people actually noticing a lack of banding with HDR content. With the terrible bitrates videos often use nowadays, you can even get banding in HDR videos too :/

    When you play an HDR and an SDR video on a desktop OS side by side, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color managed video player…) the colors may be more intense.

• Quack Doc@lemmy.world · 2 days ago

      They wouldn’t, because applying ICC profiles is opt-in for each application. Games and at least many video players don’t apply ICC profiles, so they do not see negative side effects of it being handled wrong (unless they calibrate the VCGT to follow the piece-wise TF).

      With Windows Advanced Color of course, that may change.

      It’s true that many applications normal users will use won’t, but on the flip side, creative types may actually be really familiar with applications that do. But you also have users who consume content in ICC-aware applications without doing creative things, like mpv and I think madVR too.

      What analogy?

      You talked about how laptops have a brightness configuration that desktops don’t.

      Yes, that’s exactly what happens. TVs do random nonsense to make the image look “better”, and one of those image optimizations is to boost brightness. In this case it’s far from always nonsense, of course (on my TV it was, though; it made the normal desktop waaay too bright).

      Almost certainly just trying to copy what monitors do.

      It is true that it can be random for sure, but this is to be expected. While it is useful to keep in mind that PQ is an absolute metric, it is very much intended for displays to pick and choose how they should treat the light. The mastered content is a reference, which is why we always talk in “reference nits” when we refer to grading stuff. This behavior is very much to be expected, and the user should be able to compensate for it via their own controls. I think that handling PQ as an absolute value is useful on one hand, but fundamentally flawed on the other. Indeed, this is one of the shortcomings of modern operating systems for sure.

      Personally I believe that the way to handle this is after all the other processing is done. PQ should be treated as absolute when doing anything like colorspace conversion. When your “reference” looks correct, then you can compensate for display issues. Though perhaps if you have a user-supplied chart of the display’s luminance response, other behavior should be considered.
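      Something like this ordering, as a rough sketch with hypothetical helper names (the conversion step is just a placeholder):

```python
# Sketch of the proposed ordering (helper names are hypothetical):
# 1. decode PQ to absolute nits, 2. do colorspace conversion in that
# absolute domain, 3. only then apply display/user compensation.

def pq_decode(signal: float) -> float:
    """ST 2084 EOTF: normalized signal -> absolute nits."""
    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    sp = max(signal, 0.0) ** (1 / M2)
    return 10000.0 * (max(sp - C1, 0.0) / (C2 - C3 * sp)) ** (1 / M1)

def convert_colorspace(nits_rgb):
    # Placeholder: e.g. BT.2020 -> display primaries, done on absolute values.
    return nits_rgb

def display_compensation(nits_rgb, user_gain=1.0, display_peak=800.0):
    # User/display adjustments happen last, after the "reference" is correct.
    return [min(c * user_gain, display_peak) for c in nits_rgb]

pixel_signal = [0.62, 0.50, 0.40]                   # PQ-encoded RGB
absolute = [pq_decode(s) for s in pixel_signal]     # absolute nits
converted = convert_colorspace(absolute)
print(display_compensation(converted, user_gain=1.2))
```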

      That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that’s more to do with marketing conflating the two than people actually noticing a lack of banding with HDR content. With the terrible bitrates videos often use nowadays, you can even get banding in HDR videos too :/

      When you play an HDR and an SDR video on a desktop OS side by side, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color managed video player…) the colors may be more intense.

      I’m not sure we can separate SDR/HDR from bit depth; most colorspaces’ transfers do specify a specific bit depth. But even then, say you have a 3k-nit video, which is not actually uncommon thanks to Apple: a transfer like sRGB/G2.2 or bt.1886/G2.4 will still be inadequate to display it appropriately. This of course includes if you were to do inverse tonemapping from “SDR” to “HDR” without appropriate debanding.
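      A back-of-the-envelope version of why a relative gamma transfer falls over at 3k nits, assuming a 100-nit reference white for the gamma 2.4 case (that reference value is my assumption for the example):

```python
# Sketch: what a 3000-nit highlight would require from a relative
# gamma-2.4 encoding that puts reference white at 100 nits (assumed).
reference_white_nits = 100.0
target_nits = 3000.0

# Relative encoding: signal = (L / L_ref) ** (1 / gamma)
signal = (target_nits / reference_white_nits) ** (1 / 2.4)
print(signal)   # ~4.1 -- far outside [0, 1], so it simply clips

# PQ, by contrast, encodes 3000 nits comfortably inside its range:
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
yp = (target_nits / 10000.0) ** M1
print(((C1 + C2 * yp) / (1 + C3 * yp)) ** M2)   # ~0.87
```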

      I don’t think one should try to separate bit depth and a transfer’s intended reference peak luminance from the terms SDR and HDR, because they do play an important role in the display of “SDR” content and “HDR” content. Then again, I am an ardent believer in the death of the terms HDR and SDR, especially when it comes to anything even remotely technical. To me, SDR and HDR, when it comes to video, are useful for nothing but marketing. Colorspaces are absolute and every video has one (even if it is unknown, which is its own can of crap to open).

      It’s true that higher bit depth can alleviate banding in lower luminance regions, but even at a given bit depth of 8 bits, HLG will still have less banding in lower luminance regions than sRGB will. This can indeed be noticeable on high-luminance displays that can really pump up the brightness, and especially in viewing environments with no ambient light aside from the bounce light of the TV and small-lumen stuff like phones and night lights, which is an extremely common viewing environment, even if a bad one.
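      Here’s a rough sketch of that quantization argument, using only the inverse OETFs and counting how many 8-bit codes land in the near-black region (HLG’s OOTF/system gamma is ignored, so treat it as an illustration rather than a rigorous comparison):

```python
# Sketch: how many 8-bit codes each transfer spends below a given fraction
# of display peak. More codes in the darks = finer quantization steps there.
# HLG is approximated by its inverse OETF only (OOTF/system gamma ignored).
import math

def srgb_encode(linear: float) -> float:
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def hlg_encode(linear: float) -> float:
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if linear <= 1 / 12:
        return (3 * linear) ** 0.5
    return a * math.log(12 * linear - b) + c

for fraction in (0.001, 0.01):
    srgb_codes = int(srgb_encode(fraction) * 255)
    hlg_codes = int(hlg_encode(fraction) * 255)
    print(f"below {fraction:.1%} of peak: sRGB uses {srgb_codes} codes, "
          f"HLG uses {hlg_codes}")
# -> HLG devotes noticeably more codes to the near-black region, so its
#    quantization steps there are smaller at the same bit depth.
```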

      Also, very much agree about the bitrate crap; AV1 and VVC do help… too bad YT killed off its good AV1 encodes…