I think resolution comes with an advantage over posting bitrates - in any scenario where you’re rendering a lower resolution video on a higher resolution surface, there will be scaling with all of its negative consequences on perceived quality. I imagine there’s also an intuitive sense of larger resolution = higher bitrate (necessarily, to capture the additional information).
> there will be scaling with all of its negative consequences on perceived quality
In theory this is true. If you had a nice high-bitrate 1080p video, it may look better on a 1080p display than any quality of 1440p video would, due to loss while scaling. But in almost all cases, selecting a higher resolution will provide better perceived quality because of the higher bitrate, even when the source isn't an integer multiple of the displayed size.
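The integer-multiple point can be made concrete: a quick sketch of the scale factors involved (illustrative only; real players use more sophisticated resampling filters than this distinction captures):

```python
# Why 2160p -> 1080p downscales cleanly but 1440p -> 1080p does not:
# an integer scale factor maps source pixels onto display pixels exactly,
# while a fractional factor forces resampling between pixel positions.
display_height = 1080

for source_height in (720, 1440, 2160):
    factor = source_height / display_height
    kind = "integer (clean mapping)" if factor.is_integer() else "fractional (resampling)"
    print(f"{source_height}p on a {display_height}p display: factor {factor:.3f} -> {kind}")
```

So 1440p lands at a 1.333x factor and must be resampled, while 2160p maps 2x2 source pixels to each display pixel.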
It will also be more bandwidth-efficient to target the output size directly. But streaming services want to keep the number of different versions small, and often this will already be >4 resolutions and 2-3 codecs. If they also wanted low/medium/high for each resolution, that would be a significant cost (the encoding itself, storage, and a reduction in cache hits). So they sort of squish resolution and quality together into one scale: 1080p isn't just 1080p, it also serves as a general "medium" quality. If you want "high" you need to go to 1440p or 2160p even if your output is only 1080p.
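A back-of-the-envelope calculation shows why the per-resolution quality tiers get cut (the counts below are hypothetical ladder sizes, not any particular service's actual configuration):

```python
# Number of encodes a service must produce, store, and cache
# grows multiplicatively with each axis of the ladder.
resolutions = ["480p", "720p", "1080p", "1440p", "2160p"]  # >4 resolutions
codecs = ["h264", "vp9", "av1"]                            # 2-3 codecs

base = len(resolutions) * len(codecs)
print(f"resolutions x codecs: {base} encodes")

# Adding low/medium/high per resolution triples everything:
quality_tiers = ["low", "medium", "high"]
expanded = base * len(quality_tiers)
print(f"with per-resolution quality tiers: {expanded} encodes")
```

Tripling the encode count also splits the CDN cache three ways for each rendition, which is why folding quality into the resolution axis is the cheaper design.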