Thursday 11 December 2014

Why 1080p Internet TV Is A Scam

While on our work Christmas night out yesterday (yummy beer at Das Kino and Brewdog and yummier food at Tarn Thai in Nottingham) the subject of streaming Internet TV services came up in conversation, and I tried (not helped by 3 or 4 pints of Vedett IPA) to explain why 1080p from most streaming providers, such as Netflix, is a scam. My poor alcohol-addled brain wasn't up to the maths, so I re-did some "back of a fag packet" calculations today to make it clearer (as much for my own benefit as anyone else's).

If you display an SD stream (for the sake of argument, 704x576) on most modern big screens it will look very blocky (unless you've got a good set with a decent upscaler). As such, most online video services now offer 720p streams that look almost as good as broadcast DVB-T2 (although not a patch on what you'll see on a well authored Blu-ray, which can go up to 40Mb/s video rates, ten times what you'll see on most Internet streamed video). The real deciding factor in picture quality isn't just the resolution but the bitrate: how much data is available to describe the changes in the image over a period of time. The two go hand in hand and, unfortunately, when you move up to 1080p the data requirements are *huge*. If you don't give the stream enough bitrate, the compressor will drop detail and "flatten" out the image to make up for it.

The streaming companies just can't afford to do 1080p at the same picture quality as 720p, let alone better. Netflix already account for around 34% of North American Internet traffic, and they're having all sorts of problems with Net Neutrality and with the ISPs that carry their data to their customers.

Let's do some quick maths examples to better illustrate this:

A 720p image is actually 1280x720 pixels, making for a total of 921,600 pixels
A 1080p image is 1920x1080 pixels, a total of 2,073,600 pixels

That means a 1080p image has 2.25 times more pixels in every frame than a 720p image (and a whopping 5.11 times more than a 576p SD frame!).
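If you want to check the arithmetic yourself, here's a rough Python sketch of those pixel counts (nothing assumed here beyond the resolutions quoted above):

# Pixels per frame for the three resolutions discussed above.
resolutions = {
    "576p SD": (704, 576),
    "720p":    (1280, 720),
    "1080p":   (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name:>8}: {count:,} pixels per frame")

# The ratios quoted above: roughly 2.25x and 5.11x respectively.
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x")
print(f"1080p vs SD:   {pixels['1080p'] / pixels['576p SD']:.2f}x")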

A typical streamed TV show in 720p may have a video bitrate of around 3.8Mb/s; that is, there is an average of 3.8 megabits of information available to describe all the changes in the image on screen each second (typically accounting for 50 frames in the UK, or 60 in the US). So what happens if we up the resolution to 1080p, multiplying the number of pixels by around 2.25? Do we double the bitrate, too? You'd like to think so, wouldn't you?

Let's look at my "typical" show again, this time the 1080p version. The streaming service has encoded it with a video bitrate of 4.8Mb/s. Great! More bits per second, more picture quality! Not quite... we've gone from around nine hundred and twenty thousand pixels per frame to over two million, an increase by a factor of 2.25, but we've only upped the bitrate by a measly factor of 1.26.

You're hopefully way ahead of me at this point, but let's use some more spurious maths and try to illustrate this with a made up value of "bits per pixel per second":
   
3,800,000 / (1280 x 720)  = 4.12
4,800,000 / (1920 x 1080) = 2.31

It should be pretty obvious from the above that, once you take the spatial resolution of the frame into account, a 3.8Mb/s 720p stream has a lot more data per pixel (1.78 times more) than a comparable 4.8Mb/s stream at 1080p.
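If the back-of-the-envelope version isn't convincing, here's the same sum as a short Python sketch (the 3.8Mb/s and 4.8Mb/s figures are just the example bitrates used above, not measurements from any particular service):

# "Bits per pixel per second": example bitrate divided by pixels per frame.
streams = {
    "720p @ 3.8Mb/s":  (3_800_000, 1280 * 720),
    "1080p @ 4.8Mb/s": (4_800_000, 1920 * 1080),
}

density = {}
for name, (bitrate, pixel_count) in streams.items():
    density[name] = bitrate / pixel_count
    print(f"{name}: {density[name]:.2f} bits per pixel per second")

# The 720p stream ends up with roughly 1.78x more data per pixel,
# even though its total bitrate is lower.
ratio = density["720p @ 3.8Mb/s"] / density["1080p @ 4.8Mb/s"]
print(f"720p has {ratio:.2f}x the data per pixel of 1080p")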

If you have a 1080p TV set I'd *strongly* suggest setting your maximum streaming resolution to 720p if you can. Trust me, it'll look much better *and* you'll actually use less data (useful if you're on a capped connection).
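As for the data cap point, a quick sketch with the same example bitrates gives a rough idea of what an hour of viewing costs you at each rate (real services vary, so treat these as ballpark numbers):

# Rough data used per hour of viewing at the example video bitrates.
BITS_PER_GB = 8 * 1000 ** 3  # decimal gigabytes, i.e. 1 GB = 10^9 bytes

for name, bitrate in [("720p @ 3.8Mb/s", 3_800_000),
                      ("1080p @ 4.8Mb/s", 4_800_000)]:
    gb_per_hour = bitrate * 3600 / BITS_PER_GB
    print(f"{name}: roughly {gb_per_hour:.2f} GB per hour")

That works out at roughly 1.7GB per hour at 720p against about 2.2GB per hour at 1080p, so the lower resolution really does save you data as well as looking better at these bitrates.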

(NB. apologies to anyone who actually understands colour spaces, frame rates, key frames and how MPEG encoding really works. This article was meant as a blunt instrument to illustrate a point, not as a lecture on the nuts and bolts of broadcast TV encoding).
