• cogman@lemmy.world · 4 hours ago

    People don’t like hearing this, but streaming services tune their codecs to properly calibrated TVs. Very few people have properly calibrated TVs. In particular, people really like to up the brightness and contrast.

    A lot of scenes that look like mud are that way because you really aren’t supposed to be able to distinguish between those levels of blackness.

    That said, streaming services should have seen the 1000 comments like the ones here and adjusted already. You don’t need Blu-ray levels of bits to make things look better in those dark scenes; you need to tune your encoder to allow it to throw more bits into the void.
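
    For anyone curious what “tune your encoder” looks like in practice, here’s a minimal sketch, assuming ffmpeg with libx265 (x265’s aq-mode=3 is auto-variance AQ with a bias toward dark scenes; the file names and CRF are placeholders):

    ```python
    # Hypothetical sketch: re-encode with x265's dark-scene-biased adaptive
    # quantization so the encoder spends more bits on shadow detail.
    import subprocess

    def encode_dark_biased(src: str, dst: str, crf: int = 18) -> None:
        subprocess.run(
            [
                "ffmpeg", "-i", src,
                "-c:v", "libx265", "-preset", "slow", "-crf", str(crf),
                # aq-mode=3: auto-variance AQ biased toward dark scenes;
                # aq-strength > 1.0 pushes still more bits into flat/dark blocks.
                "-x265-params", "aq-mode=3:aq-strength=1.1",
                "-c:a", "copy", dst,
            ],
            check=True,
        )

    encode_dark_biased("source.mkv", "dark_tuned.mkv")
    ```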

    • chisel@piefed.social · 2 hours ago

      Lmao, I promise streaming services and CDNs employ world-class experts in encoding, both in tuning and development. They have already pored over the quality-vs-cost tradeoff. Tuning your encoder to allow more bits in some scenes by definition ups the average bitrate of the file, unless you’re also taking bits away from other scenes. Streaming services have already found a balance of video quality vs. storage/bandwidth costs that they are willing to accept, which tends to be around 15 Mbps for 4K. That will unarguably provide a drastically worse experience on a high-enough-quality TV than a 40+ Mbps Blu-ray. Like, day and night in most scenes and even more in others.
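
      The storage/bandwidth math is easy to sanity-check. A quick sketch (the 15 and 40 Mbps figures are the ones above; the 2-hour runtime is an assumption):

      ```python
      # Rough file-size math for a 2-hour movie at streaming vs Blu-ray bitrates.
      def size_gb(mbps: float, hours: float = 2.0) -> float:
          return mbps * 1e6 * hours * 3600 / 8 / 1e9  # bits/s -> gigabytes

      print(f"15 Mbps stream:  {size_gb(15):.1f} GB")  # ~13.5 GB
      print(f"40 Mbps Blu-ray: {size_gb(40):.1f} GB")  # ~36.0 GB
      ```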

      Calibrating your TV, while a great idea, can only do so much against low-bitrate encodings and the fake HDR some services bake in solely to trigger the HDR popup on your TV and trick it into upping the brightness, rather than to actually improve color accuracy/vibrancy.

      They don’t really care about the quality; they care that subscribers will keep their subscriptions. They go as low-quality as possible to cut costs while retaining subs.

      Blu-rays don’t have this same issue because there are no storage or bandwidth costs to the provider, and people buying Blu-rays are typically more informed, have higher-quality equipment, and care more about image quality than your typical streaming subscriber.

      • cogman@lemmy.world · 1 hour ago

        > I promise streaming services and CDNs employ world-class experts in encoding

        > They don’t really care about the quality

        It’s funny that you are trying to make both these points at the same time.

        You don’t hire world-class experts if you don’t care about quality.

        I have a hobby of re-encoding Blu-rays to lower bitrates. And one thing that’s pretty obvious is that the world-class experts who wrote the encoders in the first place have them heavily tuned to omit data from dark areas of a scene, to avoid wasting bits there. This is true of H.265, VP9, and AV1. You have to specifically tune those encoders to push more of their bits into the dark areas, or you have to up the bitrate to absurd levels.

        Where these encoders spend the bitrate in dark scenes is on any areas of light within the frame. That works great if you are looking at something like a tree with a lot of dark patches, but it really falls apart on a single light source with darkness everywhere else. It just so happens that it’s really easy to dump 2 Mbps on a torch in a hall and leave just 0.1 Mbps for the rest of the scene.
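
        For the curious, these are the kinds of per-encoder knobs I mean. The exact option spellings are an assumption about recent ffmpeg builds, so verify with `ffmpeg -h encoder=libx265` and friends before relying on them:

        ```python
        # Adaptive-quantization settings that bias bits toward dark/flat regions.
        # Option names are assumptions about recent ffmpeg builds; check locally.
        DARK_TUNE_ARGS = {
            "libx265":    ["-x265-params", "aq-mode=3:aq-strength=1.1"],
            "libvpx-vp9": ["-aq-mode", "variance"],
            "libaom-av1": ["-aq-mode", "variance"],
        }
        ```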

        > That will unarguably provide a drastically worse experience on a high-enough-quality TV than a 40+ Mbps Blu-ray. Like, day and night in most scenes and even more in others.

        I can tell you that this is simply false. And it’s the same pseudo-scientific logic pushed by someone trying to sell gold-plated cables and FLAC encodings.

        Look, beyond the dark-scene tuning problem streaming services have, the other problem they have is QoS. The way content is encoded for streaming just isn’t ideal. When you say “they have to hit 14 Mbps”, the fact is that they are forcing themselves to spend 14 Mbps throughout the entire video. The reason they do this is that they want to limit buffering as much as possible: it’s a much better experience to drop your resolution than to constantly buffer. But that choice makes it really hard to do good optimizations in the encoder. Every second of the video burns 14 Mb whether it needs those 14 Mb or not. The way to deliver less data would be to only average 14 Mbps rather than forcing it throughout: allowing 40 Mbps bursts when needed, then pushing everything else out at 1 Mbps, saves on bandwidth. However, the end user doesn’t know that the reason they just started buffering is that a high-motion action scene is coming up (and Netflix doesn’t want to buffer for more than a few minutes).
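
        In ffmpeg terms, the two strategies look roughly like this (a sketch; the CBR emulation via minrate/maxrate/bufsize is approximate, and the numbers mirror the ones above):

        ```python
        # Constant-ish 14 Mbps, the QoS-friendly option: every second costs
        # ~14 Mb whether the scene needs it or not.
        cbr_args = ["-b:v", "14M", "-minrate", "14M", "-maxrate", "14M",
                    "-bufsize", "28M"]

        # Capped VBR: averages 14 Mbps, bursts to 40 Mbps for hard scenes,
        # coasts near nothing on easy ones. Cheaper overall, harder to stream.
        vbr_args = ["-b:v", "14M", "-maxrate", "40M", "-bufsize", "80M"]

        # Either list would be spliced into an encode command, e.g.:
        #   ffmpeg -i in.mkv -c:v libx264 <args> out.mkv
        ```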

        The other point I’d make is that streaming companies simply have one pipeline that they shove all video through. And because it’s so generalized, these sorts of tradeoffs, which make stuff look like a blocky mess, happen. Sometimes that blocky mess is present in the source material (the streaming services aren’t ripping the Blu-rays themselves; they get the video from content providers, who aren’t necessarily sending in raws).

        I say all this because you can absolutely get 4K and 1080p looking good at sub-Blu-ray bitrates. I have a library filled with these re-encodes that look great because of my experience here. A decent amount of HD media can be encoded at 1 or 2 Mbps and look great. But you have to make tradeoffs that streaming companies won’t make.

        For the record, the way I do my encoding is a scene-by-scene encode, using VMAF to adjust the quality level with some custom software I built to do just that. I target a VMAF score of 95, which ends up looking just fantastic across media.
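
        I can’t speak to the actual custom software, but the core loop is easy to sketch: encode a scene at some CRF, score it with ffmpeg’s libvmaf filter, and step the CRF until the score clears the target. The file names, CRF range, and JSON layout here are all assumptions about a typical setup:

        ```python
        # Hypothetical sketch of VMAF-targeted encoding for one scene.
        import json
        import subprocess

        TARGET = 95.0

        def encode(src: str, dst: str, crf: int) -> None:
            subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx265",
                            "-crf", str(crf), "-an", dst], check=True)

        def vmaf(distorted: str, reference: str) -> float:
            # libvmaf takes the distorted file first, the reference second.
            subprocess.run(["ffmpeg", "-i", distorted, "-i", reference,
                            "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
                            "-f", "null", "-"], check=True)
            with open("vmaf.json") as f:
                data = json.load(f)
            # JSON layout varies across libvmaf versions; this matches recent ones.
            return data["pooled_metrics"]["vmaf"]["mean"]

        def target_quality(scene: str, out: str) -> int:
            for crf in range(30, 10, -2):       # walk toward higher quality
                encode(scene, out, crf)
                if vmaf(out, scene) >= TARGET:  # stop at the cheapest passing CRF
                    return crf
            return 12

        print("chose CRF", target_quality("scene_007.mkv", "scene_007_enc.mkv"))
        ```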

    • IronKrill@lemmy.ca · 2 hours ago

      I fail to see where TV calibration comes in here, tbh. If I can see blocky artifacts from a low bitrate, they will show up on any screen unless you turn the brightness down so far that nothing is visible.

      • cogman@lemmy.world · 2 hours ago

        Blocky artifacts typically appear in low-light situations. There are cases where a scene is blocky simply from not having enough bits (high-motion scenes), but there are plenty of cases where low-light tuning is what makes the blockiness noticeable.

        • Cenzorrll@lemmy.world · 2 hours ago

          Blocky artifacts are the result of poor bitrates. On streaming services it’s due to over-compressing the stream, which is why you see it when part of a scene is still or during dark scenes. It’s the service cheaping out and sending UHD video at 720p bitrates.
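
          Bits-per-pixel makes the “UHD at 720p bitrates” point concrete; a quick sketch (the example bitrates are assumptions about typical ladders):

          ```python
          # Bits per pixel: bitrate divided by pixels pushed per second.
          def bpp(mbps: float, w: int, h: int, fps: float = 24.0) -> float:
              return mbps * 1e6 / (w * h * fps)

          print(f"720p @ 5 Mbps: {bpp(5, 1280, 720):.3f} bpp")    # ~0.226
          print(f"4K @ 15 Mbps:  {bpp(15, 3840, 2160):.3f} bpp")  # ~0.075
          print(f"4K @ 40 Mbps:  {bpp(40, 3840, 2160):.3f} bpp")  # ~0.201
          ```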

          • cogman@lemmy.world · 1 hour ago

            Look, this is an oversimplification of the problem. It’s popular on the internet, but it’s factually incorrect.

            Here’s a thread discussing the exact problem I’m describing:

            https://www.reddit.com/r/AV1/comments/1co9sgx/av1_in_dark_scenes/

            The issue at play for streaming services is that they have a general pipeline for encoding. I mean, it could be described as cheaping out, because they don’t have enough QA spot-checking and special-purpose encodes to make sure the quality isn’t trash. But it’s really not strictly a “not enough bits” problem.