• [deleted]@piefed.world · 6 hours ago

    The push for lossless seems more like pushback against low bitrates and reduced dynamic range by avoiding compression altogether. It's not really a snob thing so much as trying to avoid a common issue.

    The video equivalent is getting the Blu-ray, which is significantly better than streaming in specific scenes. For example, every scene I have seen with confetti on any streaming service is an eldritch horror of artifacts, but it looks fine on physical media, because streaming compression just can’t handle that kind of fast-changing detail.

    It does depend on the music or video though, the vast majority are fine with compression.

    • otacon239@lemmy.world · 6 hours ago

      My roommate always corrects me when I make this same point, so I’ll pass it along: Blu-rays are compressed too, using H.264/H.265, just less aggressively than streaming services.

          • ColeSloth@discuss.tchncs.de · 4 hours ago

          Or worse. I think it was the original Ninja Turtles movie that I owned on DVD, and the quality of it kind of sucked. Years later I got it on Blu-ray, and I swear they just ripped one of the DVD copies to make the Blu-ray disc.

            • GnuLinuxDude@lemmy.ml · 3 hours ago

            Sadly, that basically feels like what happened with The Fellowship of the Ring’s theatrical-cut Blu-ray, too. It just doesn’t look that great.

            Then the extended edition has decent fidelity but some bizarro green-blue color grading.

      • cogman@lemmy.world · 4 hours ago

        People don’t like hearing this, but streaming services tune their codecs for properly calibrated TVs. Very few people have properly calibrated TVs; in particular, people really like to crank up the brightness and contrast.

        A lot of scenes that look like mud are that way because you really aren’t supposed to be able to distinguish between those levels of blackness.

        That said, streaming services should have seen the thousands of comments like the ones here and adjusted already. You don’t need Blu-ray levels of bitrate to make those dark scenes look better; you need to tune your encoder to let it throw more bits into the void.

        • chisel@piefed.social · 3 hours ago

          Lmao, I promise streaming services and CDNs employ world-class experts in encoding, both in tuning and development. They have already pored over the quality-versus-cost tradeoff. Tuning your encoder to allow more bits in some scenes by definition raises the average bitrate of the file, unless you’re also taking bits away from other scenes. Streaming services have already found a balance of video quality versus storage/bandwidth costs that they are willing to accept, which tends to be around 15 Mbps for 4K. That will unarguably provide a drastically worse experience on a high-enough-quality TV than a 40 Mbps+ Blu-ray. Like, day and night in most scenes and even more in others.
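
          To put rough numbers on that balance (back-of-envelope only, using the bitrates mentioned above):

```python
# Back-of-envelope: data moved for one 2-hour 4K title at a typical
# streaming bitrate vs. a Blu-ray-class bitrate (megabits per second).
def stream_size_gb(bitrate_mbps: float, hours: float) -> float:
    """Total data in gigabytes (decimal: 1 GB = 8000 Mb)."""
    return bitrate_mbps * hours * 3600 / 8000

streaming = stream_size_gb(15, 2)  # ~13.5 GB per play
bluray    = stream_size_gb(40, 2)  # ~36.0 GB per play

print(f"streaming: {streaming:.1f} GB, Blu-ray-rate: {bluray:.1f} GB")
```

          Multiplied across millions of streams, that gap is the storage/bandwidth cost the services are balancing against.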

          Calibrating your TV, while a great idea, can only do so much against low-bitrate encodes and the fake HDR some services bake in solely to trigger the HDR popup on your TV and trick it into upping the brightness, rather than to actually improve color accuracy/vibrancy.

          They don’t really care about the quality, they care that subscribers will keep their subscriptions. They go as low quality as possible to cut costs while retaining subs.

          Blu-rays don’t have this same issue because there are no storage or bandwidth costs to the provider, and people buying Blu-rays are typically more informed, have higher-quality equipment, and care more about image quality than your typical streaming subscriber.

          • cogman@lemmy.world · 2 hours ago

            I promise streaming services and CDNs employ world-class experts in encoding

            They don’t really care about the quality

            It’s funny that you are trying to make both these points at the same time.

            You don’t hire world class experts if you don’t care about quality.

            I have a hobby of re-encoding Blu-rays to lower bitrates. And one thing that’s pretty obvious is that the world-class experts who wrote the encoders in the first place have them heavily tuned to omit data from dark areas of a scene to avoid wasting bits there. This is true of H.265, VP9, and AV1. You have to specifically tune those encoders to spend more of their bits on the dark areas, or you have to up the bitrate to absurd levels.

            Where these encoders do spend bitrate in dark scenes is on any areas of light within the scene. That works great if you are looking at something like a tree with a lot of dark patches, but it really falls apart on a single lit subject with darkness everywhere else. It’s all too easy for the encoder to dump 2 Mbps on a torch in a hall and leave just 0.1 Mbps for the rest of the scene.
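
            A toy sketch of that bit-allocation behavior (purely illustrative numbers, not any real codec's quantizer):

```python
# Toy illustration (not a real codec): quantizing dark regions more
# coarsely collapses subtle shadow gradients into a few flat levels --
# the banded/blocky look in dark scenes.
def quantize(levels, step):
    """Round each 8-bit luma value to the nearest multiple of `step`."""
    return [round(v / step) * step for v in levels]

shadow_gradient = list(range(16, 32))    # subtle dark ramp, 16 distinct values
highlight       = list(range(200, 216))  # the same ramp, but bright

# Hypothetical bit allocation: coarse step in shadows, fine in highlights.
coarse = quantize(shadow_gradient, step=8)
fine   = quantize(highlight, step=1)

print(len(set(coarse)), "levels survive in shadows")    # banding
print(len(set(fine)), "levels survive in highlights")   # gradient intact
```

            The shadow ramp collapses to a handful of flat bands while the identical ramp in the highlights survives untouched.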

            That will unarguably provide a drastically worse experience on a high-enough quality tv than a 40mbps+ bluray. Like, day and night in most scenes and even more in others.

            I can tell you that this is simply false. And it’s the same pseudo-scientific logic that someone trying to sell gold-plated cables and FLAC encodings pushes.

            Look, beyond the darkness-tuning problem that streaming services have, the other problem they have is QoS. The way content is encoded for streaming just isn’t ideal. When you say “they have to hit 14 Mbps,” the fact is that they are forcing themselves to do 14 Mbps throughout the entire video. The reason they do this is that they want to limit buffering as much as possible: it’s a much better experience to drop the resolution than to constantly buffer. But that constraint makes it really hard to do good optimizations in the encoder. Every second of the video they are burning 14 Mb whether they need those 14 Mb or not. The way to deliver less data would be to only average 14 Mbps rather than forcing it throughout: allowing 40 Mbps bursts when needed and pushing everything else out at 1 Mbps saves bandwidth. However, the end user doesn’t know that the reason they just started buffering is that a high-motion action scene is coming up (and Netflix doesn’t want to buffer for more than a few minutes).
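
            That CBR-versus-average tradeoff can be sketched with a toy buffer model (the per-second bitrates, network rate, and 3-second startup buffer are made-up numbers):

```python
# Toy buffer model: each wall-clock second the client downloads
# network_mbps worth of video (in video-seconds) and plays back 1 s.
# All rates are hypothetical.
def rebuffer_events(per_second_mbits, network_mbps, startup_buffer_s=3):
    """Count the seconds where the playback buffer runs dry."""
    buffered_s, stalls = startup_buffer_s, 0
    for seg in per_second_mbits:
        buffered_s += network_mbps / seg - 1  # downloaded minus played
        if buffered_s < 0:
            stalls += 1
            buffered_s = 0
    return stalls

cbr = [14] * 60                # constant 14 Mbps: perfectly predictable
vbr = [40] * 10 + [1] * 50     # a 40 Mbps burst before the buffer builds

print("CBR stalls:", rebuffer_events(cbr, network_mbps=14))
print("VBR stalls:", rebuffer_events(vbr, network_mbps=14))
```

            The constant-rate stream never stalls on a 14 Mbps link, while the bursty stream stalls repeatedly when the expensive scene arrives early, which is exactly the experience the services are engineering around.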

            The other point I’d make is that streaming companies simply have one pipeline that they shove all video through. And because it’s so generalized, these sorts of tradeoffs that make stuff look like a blocky mess happen. Sometimes that blocky mess is present in the source material (the streaming services aren’t ripping the Blu-rays themselves; they get the video from the content providers, who aren’t necessarily sending in raws).

            I say all this because you can absolutely make 4K and 1080p look good at sub-Blu-ray bitrates. I have a library full of these re-encodes that look great because of my experience here. A decent amount of HD media can be encoded at 1 or 2 Mbps and look great. But you have to make tradeoffs that streaming companies won’t make.

            For the record, the way I do my encoding is a scene-by-scene encode that uses VMAF to adjust the quality level, with some custom software I built to do just that. I target a VMAF score of 95, which ends up looking fantastic across media.
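
            The search logic behind that kind of VMAF targeting can be sketched like this. The `mock_vmaf` function is a stand-in for a real encode-and-score step (which would shell out to something like ffmpeg with libvmaf), not an actual quality model:

```python
# Sketch of per-scene quality targeting: find the most aggressive CRF
# (higher CRF = more compression) that still meets a VMAF floor.
def mock_vmaf(crf: int) -> float:
    """Placeholder: VMAF typically falls as CRF rises."""
    return max(0.0, 100.0 - 1.5 * crf)

def find_crf(target_vmaf=95.0, lo=0, hi=51, score=mock_vmaf):
    """Binary-search the highest CRF whose score still meets the target."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if score(mid) >= target_vmaf:
            best, lo = mid, mid + 1   # good enough: try more compression
        else:
            hi = mid - 1              # too lossy: back off
    return best

print("CRF for VMAF >= 95:", find_crf())
```

            Run per scene, this spends bits only where the quality metric says they are needed, which is the tradeoff a one-size-fits-all streaming pipeline skips.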

        • IronKrill@lemmy.ca · 2 hours ago

          I fail to see where TV calibration comes in here, tbh. If I can see blocky artifacts from a low bitrate, they will show up on any screen unless you turn the brightness down so far that nothing is visible.

          • cogman@lemmy.world · 2 hours ago

            Blocky artifacts typically appear in low-light situations. There are cases where a scene is blocky simply from not having enough bits (high-motion scenes), but there are plenty of cases where low-light tuning is why you end up noticing the blockiness.

            • Cenzorrll@lemmy.world · 2 hours ago

              Blocky artifacts are the result of poor bitrates. On streaming services it’s due to over-compressing the stream, which is why you see it when part of a scene is still or during dark scenes. It’s the service cheaping out and sending UHD video at 720p bitrates.

              • cogman@lemmy.world · 1 hour ago

                Look, this is just an incorrect oversimplification of the problem. It’s popular on the internet but it’s just factually incorrect.

                Here’s a thread discussing the exact problem I’m describing

                https://www.reddit.com/r/AV1/comments/1co9sgx/av1_in_dark_scenes/

                The issue at play is that streaming services have a general pipeline for encoding. I mean, you could describe it as cheaping out, because they don’t have enough QA spot-checking and special-casing encodes to make sure the quality isn’t trash. But it’s really not strictly a “not enough bits” problem.

    • Kabe@lemmy.world · 55 minutes ago

      The thing is, dynamic range compression and audio data compression are two entirely separate things. People often conflate the two, thinking that going from WAV or FLAC to a lossy format like MP3 or M4A makes the track more dynamically compressed, but that’s not the case at all. An MP3 and a FLAC version of the same track will have essentially the same dynamic range.
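
      A quick way to see the distinction: the crest factor (peak over RMS) survives waveform-mangling "lossy" steps but not a limiter. This sketch models lossy coding as coarse quantization, which is a big simplification (real codecs work in the frequency domain), but the point about dynamic range holds:

```python
import math

# A sine "track": 10 full cycles, amplitude 1.0.
samples = [math.sin(2 * math.pi * i / 100) for i in range(1000)]

def dynamic_range_db(x):
    """Crest factor: peak level over RMS level, in dB."""
    peak = max(abs(v) for v in x)
    rms = math.sqrt(sum(v * v for v in x) / len(x))
    return 20 * math.log10(peak / rms)

lossy      = [round(v * 16) / 16 for v in samples]      # crude stand-in for lossy coding
compressed = [max(-0.5, min(0.5, v)) for v in samples]  # hard limiter: actual DR compression

print(f"original: {dynamic_range_db(samples):.2f} dB")
print(f"lossy:    {dynamic_range_db(lossy):.2f} dB")      # ~unchanged
print(f"limited:  {dynamic_range_db(compressed):.2f} dB") # clearly reduced
```

      The quantized version distorts the waveform but keeps the loud and quiet parts in proportion; only the limiter actually squashes the dynamic range.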

      And yes, while audible artifacts can be a thing at very low bitrates, once you get to 128 kbps with a modern lossy codec they become pretty much impossible to hear in a blind test. Hell, even 96 kbps Opus is pretty much audibly transparent for the vast majority of listeners.

      • oktoberpaard@piefed.social · 2 hours ago

        In the distant past I liked to compare hi-res tracks with the normal ones. It turned out that they often used a different master with more dynamic range for the hi-res release, tricking the listener into thinking it sounded better because of the high bit depth and sampling frequency. The second step was to convert the hi-res track to standard 16-bit 44.1 kHz and do A/B testing to prove my point to friends.
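
        The bit-depth half of that conversion is essentially truncation. Real tools also resample to 44.1 kHz with a proper filter and add dither before truncating; this sketch skips both, and the sample values are made up:

```python
# Illustrative only: drop the low 8 bits to go from 24-bit PCM to
# 16-bit PCM. Proper conversion dithers first to avoid quantization
# distortion; a plain shift is the crudest possible version.
def to_16_bit(samples_24bit):
    """Truncate signed 24-bit integer samples to signed 16-bit."""
    return [s >> 8 for s in samples_24bit]

hires = [8_388_607, -8_388_608, 123_456]  # hypothetical 24-bit values
cd    = to_16_bit(hires)
print(cd)  # every value now fits in the signed 16-bit range
```

        Done properly (matched masters, level-matched playback), this is exactly the setup that lets an A/B test isolate the format from the master.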