"Set for a year-end release, AV2 is not only an upgrade to the widely adopted AV1 but also a foundational piece of AOMedia’s future tech stack.
AV2, a generation leap in open video coding and the answer to the world’s growing streaming demands, delivers significantly better compression performance than AV1. AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range. AV2 marks a milestone on the path to an open, innovative future of media experiences."
Looking ahead, 53% of AOMedia members surveyed plan to adopt AV2 within 12 months of its finalization later this year, with 88% expecting to implement it within the next two years.
From the AOMedia website. So the plan is for it to have AV1 levels of adoption by 2028.
AV1 was mid. Extremely slow encoding and minor performance gains over H.265. And no good encoders on release.
H266 was miles ahead, but that is proprietary like H.265. So win some, lose some.
Compression efficiency and speed are often a trade-off. H266 is also much slower than AV1 under the same conditions. Hopefully more AV1 hardware encoders will arrive to speed things up… but at least AV1 decoders are already relatively common.
Also, the gap between H.265 and AV1 is larger than the gap between AV1 and H.266, so I’d argue it’s the other way around. AV1 is reported to be capable of ~30-50% bitrate savings over H.265, at the cost of speed. The differences between H.266 and AV1 are minor: it’s reported to reach a similar range, but weighted more toward the 50% end, and at the cost of even lower speed. I’d say that once AV1 encoding hardware is more common and the higher presets for AV1 become viable, it would be a good balance for most cases.
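Just to illustrate that argument with the rough percentages above (ballpark figures, not measurements), a quick sketch:

```python
# Illustrative only: relative bitrates at roughly equal perceived quality,
# using the ballpark savings figures quoted above (not measured data).
h265 = 100.0                 # take an H.265 encode as the baseline
av1 = h265 * (1 - 0.40)      # ~30-50% savings over H.265 -> assume 40%
h266 = h265 * (1 - 0.50)     # H.266 reported to sit around the 50% end

print(f"H.265: {h265:.0f}  AV1: {av1:.0f}  H.266: {h266:.0f}")
print(f"H.265 -> AV1 gap: {h265 - av1:.0f}   AV1 -> H.266 gap: {av1 - h266:.0f}")
```

Put that way, the H.265 → AV1 step closes a much larger chunk of bitrate than the AV1 → H.266 step does.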
The thing is that H.26x has a consortium of corporations behind it, with connections and an interest in making sure they can cash in on their investment, so they get a lot of traction in getting hardware out.
So… a lot more people now have:
- 4G/5G on the go, proper broadband at home and at the office, and even in remote locations (sadly via MuskSat for now…) other ways to get data
- very capable devices: mobile phones, (mostly Android) clients such as video projectors or dongles, and of course computers
- human eyes… that, on average, can’t really appreciate 4K
… so obviously we should NOT stop looking for more efficient methods and new usages, but I’m also betting that we are basically hitting diminishing returns already. I don’t think many people care anymore about much higher screen resolution or refresh rate for typical video streaming. Because that’s the most popular usage, I imagine everything else, e.g. XR, is niche relative to it and thus has a hard time benefiting as much from the growth in performance we’ve had until now.
TL;DR: OK cool but aren’t we already flattening the curve on the most popular need anyway?
It’s not for the end user at this point, it’s for YouTube/streaming companies to spend less on bandwidth at existing resolutions. Even a 5% decrease in size for similar quality could save millions in bandwidth costs over a year for YouTube or Netflix.
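As a rough back-of-the-envelope sketch (all the numbers below are made-up assumptions just to show the order of magnitude, not real Netflix/YouTube figures):

```python
# Back-of-the-envelope: what a 5% size reduction means at streaming scale.
# Every number here is an illustrative assumption, not a real company figure.
hours_streamed_per_year = 1e9          # assumed: one billion viewing hours/year
avg_bitrate_mbps = 5                   # assumed average delivered bitrate (Mbps)
cost_per_gb_usd = 0.01                 # assumed blended CDN/egress cost per GB

gb_per_hour = avg_bitrate_mbps * 3600 / 8 / 1000   # Mbps -> GB per hour
annual_cost = hours_streamed_per_year * gb_per_hour * cost_per_gb_usd
savings = annual_cost * 0.05           # a 5% size reduction at equal quality

print(f"Annual delivery cost: ${annual_cost:,.0f}")
print(f"Saved by a 5% size reduction: ${savings:,.0f}")
```

Even with those conservative assumptions, a single-digit-percent efficiency gain lands in the millions per year.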
Thanks for the clarification. It makes me wonder, though: is it bandwidth saving at no user cost? i.e. is the compression improved without requiring more compute on the user’s end to decompress?
Without hardware decoding, it will take more compute to decompress, but sites usually wait to fully roll out new codecs until hardware decoding is more ubiquitous, because of how many people use low-powered streaming sticks and Smart TVs.
Then it’s arguably shifting some of the cost to the end user: large streaming companies spend a bit less on IXP contracts, while viewers have to have newer hardware, which might also need a bit more energy to run.
On the upside, the end user needs less data for the same content. This is particularly interesting under 4G/5G and restrictive data plans, or when accessing places/servers with a weak connection. It helps avoid having to wait for the video to “buffer” mid-playback.
But yes, I agree each iteration has diminishing returns, with a higher bump in requirements. I feel that’s a pattern we see often.
This! Also there’s AI upscaling: if good enough, it could (in theory) make a 1080p video look like 4K in a way only very few lucky, healthy young people would be able to tell apart. In the meantime, my eyesight progressively gets worse with age.
sigh
I literally just bought a 3050 for my NAS for AV1 support.
/wrists
You didn’t do the wrong thing.
What many people don’t notice is that support for a codec in a GPU (in hardware) has two parts: one is decoding and one is encoding.
For quality video, nobody does hardware encoding (at least not on consumer systems like this Nvidia 3050).
For most users, the important thing is hardware support for decoding, so that they can watch their 4K movie with no issue.
So you are in the clear.
You can watch AV1 right now, and AV2 won’t become popular enough to be widely used for at least another 4 years.
God damn it, not again
The main thing I want is small file size for the quality. Netflix, YouTube, and I agree on that.
Most of my stuff is AV1 today even though the two TVs I typically watch it on do not support it. Most of the time, what I am watching is high-bitrate H.264 that was transcoded from the low-bitrate AV1.
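In my case the media server does that transcode on the fly, but it is roughly equivalent to something like this (the paths and flag choices are just illustrative, not a recommendation):

```python
# Rough equivalent of what the media server does when a client can't decode AV1:
# decode the small AV1 "master" and re-encode it to high-bitrate H.264.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "movie.av1.mkv",        # the AV1 file I actually store
    "-c:v", "libx264",            # universally supported playback codec
    "-preset", "veryfast",        # favor speed, since this output is transient
    "-crf", "18",                 # high-bitrate / near-transparent H.264
    "-c:a", "aac", "-b:a", "192k",
    "movie.playback.mkv",
], check=True)
```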
I will probably move to AV2 shortly after it is available. At least, I will be an early adopter. The smaller the files the better. And, in the future when quality has gone up everywhere, my originals will play native and look great.
I want to agree with you, and I do to a large extent. I like new codecs, and having a more open-source-y codec is better than using a codec that has many patents. Long-term patents (the current situation) slow technological progress.
What I don’t agree with you on are some details.
First, Netflix, YouTube, and so on need low bitrate, and they (especially Google/YouTube) don’t care that much about quality. Google/YouTube videos are really bit-starved for their resolutions. Netflix is a bit better.
Second, many people, when they discuss codecs, are referring to a different use case: archiving. As in, the best-quality codec at the same size. So they compare the original (raw video, no lossy codec used) with the encoded ones. Their conclusion is that AV1 is great for size reduction but can’t beat H.264 for fidelity at any size. I think H.264 has a placebo or transparent profile but AV1 doesn’t.
So when I download a fi… I mean a Linux ISO from torrents, I usually go for the newest codec. But recently I don’t go for the smallest size, because it takes away detail from the picture.
But if I want to archive a movie (one that I like a lot, which is rare), I get the bigger H.264 (or, if it’s a UHD Blu-ray, H.265).
Third: a lot of people’s idea of codec quality is formed by downloading or streaming other people’s encoded videos, and they themselves don’t compare the quality (as they don’t have the time or a good raw video to compare against).
Fourth: I have heard AV1 has issues with film grain, as in it removes it. Film grain is an artifact of physical film (non-digital) that unfortunately many directors try (or used to try) to duplicate, because they grew up watching movies on film and think that’s how movies should look, so they add it in post-production. Even though it is literally a defect, and even the human eye doesn’t reproduce it, so it isn’t even natural. But this is still a bug of AV1 (if I read correctly), because a codec should go for high fidelity, not high smoothness.
AV1 has issues with film grain. There are things you can do. Let me admit, however, that one movie I have not encoded as AV1 is a restored version of the original Star Wars, and film grain is a contributor to that.
Another thing about film grain, though, is that it is often artificially added afterward, as you say. With AV1, you can often get amazing compression that removes the grain as a side effect and then just add it back yourself. To each their own how they feel about this approach.
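For what it’s worth, SVT-AV1 can do that remove-and-re-add step itself with grain synthesis; a rough sketch of how I’d invoke it (the parameter names and values are from memory, so verify against your build):

```python
# Sketch: let SVT-AV1 denoise the source and synthesize grain back at decode time.
# Parameter names/values are from memory and may differ between builds - verify.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "grainy_source.mkv",
    "-c:v", "libsvtav1",
    "-crf", "30", "-preset", "6",
    # film-grain: strength of the synthesized grain (0 = off)
    # film-grain-denoise: denoise the input before encoding, then re-add grain
    "-svtav1-params", "film-grain=10:film-grain-denoise=1",
    "-c:a", "copy",
    "grain_synth.mkv",
], check=True)
```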
I also agree that H.264 can be more transparent. However, that is at massive file sizes. Others may have the space for that but I do not… Perhaps I do not have the eyes for it either. I am not extracting and comparing single frames. To me, the AV1 files that I have look better at the size I am archiving at than they would using any other codec.
I use the fact that massive bit rate H.264 looks great to my advantage as that is what my AV1 is being transcoded into when I watch it most of the time.
Some content compresses better than others. Sometimes I get massive size reductions with AV1 at what looks like great quality to me. Other times, it struggles to beat H.265 or even H.264 at similar quality. It is pretty rare that I do not choose AV1 though.
I often use Netflix’s VMAF to get an idea of target compression. It is not perfect though; you have to verify visually. It does save time trialing different parameters, though.
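Roughly, my check looks something like this (assuming an ffmpeg build with libvmaf; the input order and the score parsing are worth double-checking against your version):

```python
# Sketch: sanity-check an encode against the source with VMAF via ffmpeg.
# Assumes ffmpeg was built with libvmaf; both inputs should match in
# resolution and framerate or the comparison is meaningless.
import subprocess

result = subprocess.run([
    "ffmpeg",
    "-i", "encode_av1.mkv",    # distorted (the candidate encode)
    "-i", "source.mkv",        # reference (the original)
    "-lavfi", "libvmaf",
    "-f", "null", "-",
], capture_output=True, text=True)

# ffmpeg prints the mean VMAF score to stderr at the end of the run
for line in result.stderr.splitlines():
    if "VMAF score" in line:
        print(line)
```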
I should say that the audio codec is another big factor. I typically pair AV1 with Opus audio and the size reductions there are amazing even at quality levels that are transparent to me.
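A sketch of my usual starting point (the CRF, preset, and audio bitrate are just where I begin, not universal recommendations):

```python
# Sketch of the usual AV1 + Opus pairing; values are a starting ballpark only.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "source.mkv",
    "-c:v", "libsvtav1", "-crf", "32", "-preset", "6",
    "-c:a", "libopus", "-b:a", "128k",   # transparent-to-me stereo audio
    "output.mkv",
], check=True)
```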
If AV2 offers better quality at the same size, or similar quality at smaller sizes, I will likely switch to it long before having hardware that can play it natively.
Oh, don’t get me wrong. As I said, I agree with most of your original post (and now your second one).
My gripe with grain was not about AV1 per se; it was with moviemakers who add it just because they think that’s how movies should be.
This seems ridiculous to me: “Reasons to Keep Film Grain On: Artistic Effect: Film grain can add a nostalgic or artistic quality to video and photography, evoking a classic film look”, because the reason is just the director’s nostalgia; as in, if he had been born after the digital era, he would (usually) take issue with it and not add it.
About H.264 and transparency: the issue is not that H.264 can only get there at high bitrates; the issue is that AV1 (as I read) can’t get there at any bitrate.
But overall I agree with you.
I was even recently shocked to see how much faster AV1 encoding has gotten. I would have thought it was still orders of magnitude slower, but with some settings (comparable to x265’s slow preset) AV1 has the same encoding speed.