that’s it that’s the post.

JK, but seriously though. AV1 is incredible, and I NEED hardware decoding support to roll out, fast. I just shoved an 18GB Blu-ray movie into 5 gigs, with room for improvement. I can get stream-worthy 1080p60 video at 6000kbps, where I'd need at least double that with x264. Even at an okay encode speed, the 1080p 6000kbps video on YouTube looks pretty good all things considered - sure, my super-high-bitrate x264 video looks clearer, but it's also at least double the file size on my disk.

I could probably real-time CPU encode my streams with AV1. I could definitely do it even better with hardware encoding.
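For anyone wanting to try this themselves, a software AV1 encode can be kicked off with ffmpeg's SVT-AV1 wrapper. This is just a sketch - the filenames and the exact preset/CRF values are placeholders, not anyone's confirmed settings:

```shell
# Re-encode input.mkv to AV1 with SVT-AV1 in CRF (quality-targeted) mode.
# Higher -preset = faster but lower efficiency; higher -crf = smaller file,
# lower quality. Audio is copied through untouched.
ffmpeg -i input.mkv -c:v libsvtav1 -preset 6 -crf 32 -c:a copy output.mkv
```

For a real-time stream you'd use a faster preset and a bitrate cap instead of CRF, at some quality cost.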

AV1 is black magic. It feels wrong.

  • kilorat@pawb.social · 1 year ago

    I haven’t found a good encoder. I’ve used all the ones that come with ffmpeg, such as libsvtav1 and librav1e, but the results aren’t as good. I screenshot a frame from the original and from the transcoded file, then compare against h265, and the AV1 version has noticeable artifacts - like, an object will move across the sky and sometimes leave a few pixels behind. I just assume the problem is with the encoder, and if I can get hold of a GPU that does AV1 encoding, then I can finally start using it. Until then, I’m using VP9 for the short clips I publish, since that works on Discord and Telegram. Now if only Mastodon would support literally anything except h264/aac, then I wouldn’t have to add an h264 version to the pipeline for everything.
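    The screenshot comparison described above can be done with ffmpeg by grabbing the same timestamp from both files - a sketch, with hypothetical filenames and timestamp:

    ```shell
    # Extract one frame at the same timestamp from the source and the
    # transcode, for a side-by-side artifact check.
    ffmpeg -ss 00:01:23 -i source.mkv    -frames:v 1 source_frame.png
    ffmpeg -ss 00:01:23 -i transcode.mkv -frames:v 1 transcode_frame.png
    ```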

    • Yote.zip@pawb.social · 1 year ago

      Hardware encoding is worse quality than software encoding IIRC (per filesize). libaom should be the best-quality encoder, but it’s also the slowest, I believe? IDK, the last time I checked on AV1 was a couple of years ago, and the encoders are always in flux. You can use VMAF for a deterministic quality score if you don’t want to pixel-hunt.
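      A VMAF score can be computed directly with ffmpeg's libvmaf filter - a sketch, assuming an ffmpeg build compiled with libvmaf support, and placeholder filenames:

      ```shell
      # Score a transcode against its source with VMAF. The first input is
      # the distorted file, the second is the reference; a pooled VMAF
      # score is printed at the end of the run.
      ffmpeg -i transcode.mkv -i source.mkv -lavfi libvmaf -f null -
      ```

      Both inputs need the same resolution and frame timing for the comparison to be meaningful.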

    • Senil888@pawb.socialOP · 1 year ago

      What settings have you been using? I haven’t noticed any issues so long as I’m not concerned with real-time encoding. And yeah, GPU encoding is generally worse than software, it’s just usually way faster.

      EDIT: for reference, I’ve been using speed 6 and an RF between 40 and 30, and even in fast-paced scenes like the ones in “Puss in Boots: The Last Wish” I can’t notice anything super off. With my real-time recordings, the best I can do is speed 7 at 6000kbps (maybe a bit higher), which isn’t quite enough for the fast, colorful 1080p60 gameplay of Splatoon 3 - but even then, I’d need a decently higher bitrate with either x264 or x265, especially GPU-encoded.

      IDK his exact settings, but a fur I follow on Mastodon has gotten 4K Blu-ray rips encoded down to a handful of gigabytes with AV1, and he reports no noticeable quality problems at speed 6, RF 30.
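      As an ffmpeg invocation, settings along those lines would look roughly like this - a sketch only, treating the speed/RF values as SVT-AV1 preset/CRF (different tools' RF scales aren't guaranteed to map one-to-one), with hypothetical filenames:

      ```shell
      # SVT-AV1 at preset 6, CRF 30, 10-bit output (10-bit generally helps
      # AV1 avoid banding even for 8-bit sources). Audio copied through.
      ffmpeg -i bluray_rip.mkv -c:v libsvtav1 -preset 6 -crf 30 \
        -pix_fmt yuv420p10le -c:a copy movie_av1.mkv
      ```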

      • kilorat@pawb.social · 1 year ago

        The settings depend on what I’m doing. If I’m trying to squeeze a lot of video into a small space, I’ll crank up the CRF; for good quality I was using CRF 40 or so. If something is already encoded, simply transcoding will be lossy and look blurrier if you compare them side by side, so for archiving I decided to just always keep the source, even if it’s raw DVD or some other wasteful codec.

        For making clips and things out of it, though, I’m trying to use the best thing that doesn’t waste a lot of space and is playable by most people. It was a while ago that I compared encoders, but I settled on libsvtav1. librav1e seemed to have even fewer options and didn’t seem better, though maybe there’s a way to tune it. What killed it for me was when I noticed an artifact in an AV1 video, and when I encoded the same video with h265 at the same bitrate, it did not have that problem.

        Eventually one of my subscribers complained they couldn’t view my h265 videos on their phone, so I switched to VP9 for more compatibility, without having to stoop back to the least-common-denominator codec that is h264. So even if I figure out the best encoder library and settings, I can’t use it right now because of compatibility - I’m not just making videos for myself to look at. I’ll probably try pushing for AV1 again soon just to see, because it’s inevitable that it will be supported everywhere.

        I’m sad to hear people saying GPU encoding can’t be better; I hoped there would be an option to make it take longer but do a really good job on quality per byte. I still don’t have a GPU that can do AV1 encoding, and I can’t find any articles comparing the quality of the output - it’s always just about how fast it encodes.
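        For the compatibility-focused VP9 clips mentioned above, the usual approach with ffmpeg is a two-pass constrained-quality encode into WebM - a sketch with placeholder filenames and values, not the commenter's actual pipeline:

        ```shell
        # Pass 1 gathers statistics (no audio, output discarded); pass 2
        # does the real encode. -crf with -b:v 0 gives quality-targeted
        # VP9; Opus audio in a WebM container plays almost everywhere.
        ffmpeg -i clip.mkv -c:v libvpx-vp9 -crf 33 -b:v 0 \
          -pass 1 -an -f null /dev/null
        ffmpeg -i clip.mkv -c:v libvpx-vp9 -crf 33 -b:v 0 \
          -pass 2 -c:a libopus clip.webm
        ```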

        • Yote.zip@pawb.social · 1 year ago

          For hardware encoding, it will depend on your hardware encoder. Even for h264 etc., the hardware encoders have always been a step down from software encoding. This is a quick chart of Intel’s AV1 encoder compared to other common software and hardware encoders. It’s from this video, which IIRC was very informative on this topic for Intel GPUs specifically.

          I see they put out at least a couple more videos on AV1 in general here and here if you’re interested, but I haven’t watched them.