Let me start by saying that I’m not an avid football viewer. I’ll watch the odd World Cup game, but that’s about it. I do, however, care about the quality of the picture that everyone sees. The other weekend I was at the house of a friend of mine and he was watching a game on his relatively new Sony UHD TV (which he bought on a recommendation from me). I sat down to watch the game and within about five seconds I noticed that there was something seriously amiss with the picture.
I drew a breath and mentioned to him that there were some picture issues, which I had identified as being caused by aggressive motion interpolation. His immediate response was that this was because it was streamed via the Now TV app from a Playstation 4. I knew this wasn’t the cause, so I pointed out the individual problems; he concurred and said I should have a look at the settings at halftime. I almost had to shut my eyes for the remaining 40 minutes – it was that bad. So, at halftime I dove into the menus (I have a similar TV) and it was a relatively simple thing to sort out with a few menu adjustments, which is not really the point. The point is that both he and his father, who was watching at the same time, had not seen these issues – or is it that they had got so used to seeing them that they no longer registered as a problem? That is more worrying, as this must have been going on since they bought the TV a few months earlier.
Not just a soap opera effect
Now, just to be clear, this wasn’t a case of what is commonly termed ‘the soap opera effect’; no, this was much worse. The picture suffered severe motion judder, dropped in and out of progressive mode, and showed objectionable distortion around movement, especially bad in close-ups. I have to admit that I probably notice issues like this more than most. I’ve been trained to look at problems and point out the tiniest thing, but this should have been quite obvious. Is this widespread? Are average viewers not noticing these problems? Have they grown accustomed to substandard images? I know that, really, in an ideal world all TVs should be calibrated, and that would pick up issues like this as well as getting the best greyscale, gamma and colour performance... but let’s face it: on a mid-price TV, a calibration that costs about half the purchase price is not going to be on most people’s radar.
I lay roughly 60% of the blame for these issues at the feet of the manufacturers and the rest at the feet of the user. Yes, the user should know what good pictures look like, but equally the manufacturer should not offer presets like ‘Vivid’ that destroy the picture... and don’t get me started on the default luminous colour. There is another side to this: how many people pay for a calibration and then go in later and change things? After all, it is their TV. Quite a few, I bet. People know what they like, and that might not be ‘as the director intended’, but they paid the money for their TV and surely they should have it set the way that they want... it does happen. Even if the calibrator has locked out changes, it’s easy to find the passcodes online.
Subjective picture quality
Picture quality can be subjective, and I’ve previously mentioned my dislike for added grain. There are, of course, documented standards that should at the very least be a starting point for high-quality pictures, assigned to presets like ‘cinema’ in the TV menu. If someone decides that they don’t really like the picture looking like that then of course they can change it, but my approach is always to copy the settings to another preset like ‘custom’ or ‘standard’ and then change that one, especially if you have paid good money for a professional calibration.
There’s no such thing as a preset that suits all content, let alone one that suits all ambient lighting situations. When a TV is calibrated by a professional, that is taken into account, and often different presets are calibrated for different lighting levels, most commonly night and day. Most people, I think, adjust the picture settings based on what they most commonly watch, which is entirely appropriate, but then they settle for that compromise rather than adjusting individual presets to get the best out of each type of content. I get it, I totally do, and I’m guilty of it too: it’s a pain to think about engaging different presets on a regular basis depending on what you are watching. Yes, I know the same standards should apply to all content and this shouldn’t be necessary but, as I have found, some TVs don’t treat interlaced content well when set to a cinema preset, and you might want a punchier image and a bit of motion interpolation for sport.
Why can’t so-called smart TVs – really another misnomer – include the ability to change profile automatically based on content? If I’m watching a film or a high-quality made-for-TV drama then yes, the cinema preset would be completely suitable, but if I’m watching sport or some other content then surely by now the TV should be able to auto-detect that and switch to another preset, and do it reliably without flipping back and forth. Maybe this should be based on data read from the EPG or other available metadata? Let’s use some of this new AI processing that seems to be the buzz term at the moment to make TV processing better. After all, upscaling has come on in leaps and bounds in only a few years; it’s just that some other image enhancements, improperly applied, can ruin a great picture.
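To illustrate the EPG-driven idea above, here’s a minimal sketch in Python of how a genre-to-preset mapping might work. The genre strings and preset names are entirely hypothetical – real TVs don’t expose an API like this – but the point is the safe fallback: when the metadata is missing or unrecognised, the TV should stay on a sensible default rather than switch back and forth.

```python
# Hypothetical sketch: pick a picture preset from EPG genre metadata.
# Genre labels and preset names are illustrative, not any real TV's API.

PRESET_BY_GENRE = {
    "film": "cinema",    # respect film cadence, no motion interpolation
    "drama": "cinema",   # high-quality made-for-TV drama
    "sport": "sport",    # punchier image, a touch of interpolation
    "news": "standard",
}

def choose_preset(epg_genre: str, default: str = "standard") -> str:
    """Return the picture preset for a programme's EPG genre.

    Falls back to a safe default so the TV never switches erratically
    when the metadata is missing or unrecognised.
    """
    return PRESET_BY_GENRE.get(epg_genre.lower().strip(), default)

print(choose_preset("Film"))   # cinema
print(choose_preset("Sport"))  # sport
print(choose_preset(""))       # standard (unknown genre)
```

A real implementation would also need some hysteresis – only switching once the new genre has been stable for a while – to avoid the back-and-forth flipping mentioned above.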