Before we get into anything else, please note that this is the second part of my review of the new Apple TV 4K. The first part, focusing on the new box’s strengths, is found here – and in the interests of fairness and balance, I suggest that you read that article first if you haven’t already done so. Not least because Apple really has delivered some big – if in some cases long overdue – steps forward with its latest Apple TV box.
That said, it hasn’t exactly proved a stretch to find as many problems with the Apple TV 4K as there are strengths – and some of these issues really are substantial, particularly if you’re the sort of AV fan most likely to be excited by the Apple TV 4K’s new high-end picture quality features.
1) It’s expensive

With a starting price of $179 (£179) for a 32GB version that rises to $199 (£199) if you fancy 64GB of storage, the Apple TV 4K is seriously expensive by video streamer standards.
Amazon’s 4K-capable Fire TV costs just $89.99 by comparison; Roku’s 4K-capable Ultra streamer costs $129; Google’s Chromecast Ultra costs just $69 (£69); and even the Nvidia Shield now only costs $179 – and that’s pretty much a gaming PC as well as a 4K HDR video streamer.
None of this is surprising, of course; Apple isn’t exactly renowned for knowingly under-pricing its products.
It is important to add on this occasion, though, that as discussed in the ‘positive’ companion piece to this article, Apple TV 4K arguably justifies its up-front cost more than any previous Apple TV generation thanks to its ground-breakingly affordable iTunes 4K movie pricing.
2) You don’t have a 4K TV
Just buying a 4K-capable source does not magically make an HD TV play 4K. This will be obvious to some readers, but I know from experience that it’s quite common for consumers not to understand that 4K sources only work properly with screens that have a matching native 4K resolution. The same goes for high dynamic range.
If you don’t have a 4K TV yet but expect to get one soon, then obviously the Apple TV 4K is worth considering. If, however, you’re perfectly satisfied with your HD TV and expect to stay that way for some time yet, the Apple TV 4K doesn’t offer enough compelling new features beyond its 4K/HDR support to justify its extra cost over and above the still-available and cheaper 4th-gen, HD-only Apple TV.
3) Outputting every source in the same video format is a bit nuts
Uniquely, the Apple TV 4K outputs all of its sources – games, video streams, your photo and video collection… everything – in exactly the same video format. This format is auto-determined by the Apple TV 4K box during initial setup by ‘reading’ the capabilities of your TV via the HDMI connection.
In the vast majority of cases the output will be set to 4K Dolby Vision 60Hz, 4K HDR10 50/60Hz, or 4K HDR10 30Hz, depending on whether your TV supports Dolby Vision and what 4K HDR frame rate your TV’s HDMI input – and cable! – can support.
As noted in the positive-minded companion article to this one, there are actually some understandable user-experience reasons for Apple adopting such a ‘one video output format suits all’ policy. Unfortunately, though, Apple’s approach also creates major picture playback issues that will drive AV enthusiasts nuts and sometimes frustrate even the most casual users.
For instance, if you play a 24-frames-per-second, high definition, standard dynamic range source through the Apple TV 4K box but the output is set to 4K HDR 60Hz, then Apple will use processing to convert the source image to 4K HDR 60Hz for output. The color range will be expanded, the image’s brightness range will be stretched, detail will be added according to Apple’s image calculations, and the frame rate will be adjusted – presumably by frame interpolation.
It will come as no surprise to AV enthusiasts to learn that the results of this level of processing interference routinely look horrible. Colors look unnatural, light levels look forced, and elevating the brightness of what can be quite compressed images – especially streamed HD sources – makes noise issues such as fizzing edges and blocking easier to see.
Such ugly picture results are hardly the sort of thing you’d expect to see from a device that Apple claims to be a revolution in picture quality.
To be fair, some conversion circumstances work OK. For instance, Dolby Vision source content still looks clean if you only have an HDR10 TV; just not quite as rich in color and peak light detail.
This may be because the Apple TV is simply extracting a genuine HDR10 ‘core’ from the Dolby Vision feed rather than using its own internal processing to somehow deliver a Dolby Vision to HDR10 conversion. I’ve asked Apple to clarify what’s happening in this circumstance and will update this article if they answer.
A little video noise, however, seems to be added to the image when the Apple TV 4K has to convert a 4K film delivered in the industry-standard HDR10 format (such as Alien: Covenant on iTunes) to Dolby Vision to feed into a Dolby Vision-capable TV.
Actually, I’m struggling to understand how such a conversion is even possible. Is the Apple TV inventing on the fly the sort of dynamic metadata that defines Dolby Vision and adding it to the HDR10 feed? And can such an ‘invented’ Dolby Vision signal really meet the standards Dolby usually expects to be associated with a Dolby Vision experience? It’s baffling, honestly.
Apple’s conversion processes also seem to deliver rather variable results with different apps – and even with different content from the same app in Netflix’s case.
Interestingly the most generally successful conversion results are seen with Apple’s own iTunes platform, suggesting that Apple has been able to work more closely with its own content delivery system than it has with third party platforms when developing its 4K, frame rate and HDR conversion calculations.
Even where Apple’s conversion systems work reasonably well, though, the bottom line for many AV enthusiasts will be that all too often, when you’re watching something through the Apple TV 4K, you’re simply not seeing the content looking as it was designed to look.
Thankfully Apple does provide an extensive set of alternative video output settings you can choose manually. These include pretty much every combination of frame rate, HDR and standard dynamic range format you could think of.
As a result, the Apple TV 4K does at least provide the facility to output different sources in their pure, native formats (assuming, that is, that anything is ever sent directly to the Apple TV 4K’s HDMI output without first being run through Apple’s video processing system…). However, does anyone, especially more mainstream users, really want to go through the hassle of constantly changing the Apple TV 4K’s video output to suit different sources?
It is worth remembering, of course, that if a 4K TV is fed a less-than-4K content source, it will have to upscale it to its screen resolution – no 4K TV will play an HD source in a completely pure form. If you have a high-end 4K TV, though, you may well prefer to trust that TV’s upscaling processing over the Apple TV 4K’s. And unlike the Apple TV 4K, 4K TVs don’t typically apply automatic HDR conversions or frame-rate upgrades on top of that upscaling.
In the end, it seems almost perverse of Apple not to provide at least the option for the Apple TV 4K to automatically adapt its video output to that of the source it’s playing, so that your TV always receives a native source image.
Not everyone would use such an option, preferring to stick with a slicker operating experience – and that’s fine. But the option should still be there.
So obvious is this, in fact, that I find it almost impossible to believe Apple won’t add an automatic switching option via a future firmware update. Unless, as suggested earlier, the box is designed so that all video always has to run through the deepest recesses of Apple’s video processing engine…