There’s a statement I’d like to make before continuing with this piece: cameras should be kept separate from TVs in any discussion of 4K. Shooting in a larger frame than you intend to deliver is standard practice, both on the camera and in the editing suite. Shooting in 4K for a 1080p edit is comparable to shooting in slow motion for a regular-speed scene – sure, it’s more information than you strictly need, but it allows you to do things that standard 1080p at 25fps won’t allow without a real struggle. Tracking, for example, becomes far easier: editing software moves the frame to keep a particular object static in one place on screen, and with a larger source image you can keep full 1080p quality without sacrificing your ability to track bigger motion (other factors, including shutter speed, would also be an issue – but those can be sorted on the camera itself). But what about the TV? Is it really necessary?
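The stabilisation trick described above comes down to cutting a native 1080p window out of the larger 4K frame and moving that window with the tracked subject, so no upscaling is ever needed. Here is a minimal sketch of that idea; the function name, frame data, and tracked positions are all hypothetical, not any particular editing package’s API:

```python
import numpy as np

UHD_W, UHD_H = 3840, 2160   # 4K UHD source frame
HD_W, HD_H = 1920, 1080     # 1080p delivery frame

def crop_1080p(frame, cx, cy):
    """Cut a native-resolution 1080p window out of a 4K frame,
    centred on the tracked point (cx, cy), clamped to the frame edges."""
    x = int(min(max(cx - HD_W // 2, 0), UHD_W - HD_W))
    y = int(min(max(cy - HD_H // 2, 0), UHD_H - HD_H))
    return frame[y:y + HD_H, x:x + HD_W]

# Hypothetical example: a blank 4K frame, with the "subject" drifting right
# over three frames, as a tracker might report it.
frame = np.zeros((UHD_H, UHD_W), dtype=np.uint8)
for cx in (1000, 1400, 1800):
    window = crop_1080p(frame, cx, 1080)
    assert window.shape == (HD_H, HD_W)  # full 1080p output, no upscaling
```

Because the window is only ever moved, never scaled, every delivered frame stays at true 1080p quality – which is exactly why the extra 4K pixels are useful even when nobody watches in 4K.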
Been here before
Remember when DVD was first introduced? The picture was a leap for youngsters with really good eyesight, and the sound quality was better – all because it was a switch to digital media. Heck, one could even argue that the advent of HD (1080p – don’t give me any of that “HD ready” crap!) was a leap forward for larger screens. But DVD had a fairly slow adoption rate – even HDTVs had a fairly slow adoption rate, which only sped up thanks to flat LCD screens. Even now, some of the best innovations in TVs are built-in Wi-Fi, with apps like YouTube and the TV channels’ own on-demand services (along with Netflix). 3D TVs aren’t really getting anywhere. HD DVD versus Blu-ray – we’ve all witnessed a lot of technical innovation in TVs and optical media over the past ten years. People have become increasingly used to changes in the way we use televisions as pieces of technology.
Where is it all leading?
One of the factors of upgrading is that you can’t just plug in old devices and have them look new. The reason DVDs are still being sold is that they can be played in Blu-ray players as well; you don’t have to rebuy your collection. Heck, many Blu-ray players even upscale DVDs. But there’s a limit to how many pixels the human eye can distinguish. That’s why Apple came out with the “Retina Display” – pixels so tightly packed they no longer look like pixels, even from a fairly close distance. No doubt 4K TVs will have an audience with early adopters, but will they get much further than that? And since 8K tech is already being worked on, will people just skip the minor upgrade and wait for 8K TV? How big a leap will it take for people to upgrade their equipment? Bear in mind that it’s not just the physical TV that needs upgrading – what about devices that can play 4K? What about cables? What about the content?
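That “limit to how many pixels the eye can distinguish” can be put on a back of an envelope. A common rule of thumb is that normal vision resolves detail down to about one arcminute, so a pixel stops being visible once it subtends less than that angle. The sketch below applies that rule to a 16:9 panel; the one-arcminute figure and the function name are my assumptions, not anything from the original piece:

```python
import math

def retina_distance_m(diagonal_in, horiz_px=3840, vert_px=2160):
    """Approximate viewing distance (in metres) beyond which individual
    pixels blur together, assuming ~1 arcminute of visual acuity."""
    # Physical width of a 16:9 panel with the given diagonal (inches -> metres).
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch = width_m / horiz_px          # size of one pixel, in metres
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

# A 55" 4K set "resolves out" at roughly 1.1 m; the same size in 1080p
# at roughly 2.2 m – so from an ordinary sofa, the two can look identical.
print(round(retina_distance_m(55), 2))             # 4K
print(round(retina_distance_m(55, 1920, 1080), 2)) # 1080p
```

The point this arithmetic makes is the one in the paragraph above: unless you sit unusually close or buy an unusually large screen, 4K’s extra pixels fall below what the eye can separate.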
The problem of the market
There are a lot of factors involved in whether a new technology will sell, including how inexpensive it is, whether it’s a significant enough upgrade, how usable it is, and so on. I, personally, don’t think 4K will really take off until it’s relatively cheap and it’s got a monopoly. What do I mean? Well, if virtually every TV sold in a retailer has 4K, of course it’s going to succeed. Look at Windows 8: it’s sold with virtually every laptop now, and it’s difficult to actually go into a store and buy a machine that doesn’t have Windows 8 already installed. HDTVs slowly did the same. Cathode ray TVs dwindled because manufacturers stopped making them – and why? Because CRTs were heavy, bulky, and could only go up to a certain size. 4K TVs don’t have many disadvantages like that to fix. In fact, 4K TVs will probably be bigger, and therefore bulkier and possibly heavier – not to mention greater consumers of space.
So where should the direction of TVs be going? I would prefer to see wireless TVs. I’m not just talking Wi-Fi enabled, but ones that can connect to devices wirelessly – without the need for an aerial lead coming in (you’d probably still need some sort of “sender” running from the aerial) – and that can connect to wireless speakers. How about sets that monitor the sounds in the room to determine whether the volume needs to be raised or lowered? How about refresh rates so high that 4 or 5 people could use the TV at once, all watching different things? Obviously you’d need glasses, but it’s not outside the realms of possibility – active 3D TVs already do this with active shutter technology, usually at 100 or 200 Hz. Some form of audio split would need to happen too.
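The multi-viewer idea is really just arithmetic: the panel time-slices its refresh cycles between viewers, so the required panel rate is the number of viewers times the rate each one needs (doubled again for stereo 3D, since each eye gets its own slice). A toy calculation, with the 50 Hz-per-viewer figure as my assumption:

```python
def required_panel_hz(viewers, per_viewer_hz=50, stereo=False):
    """Back-of-envelope panel refresh rate needed to time-slice one
    screen between several viewers wearing active-shutter glasses."""
    slices = viewers * (2 if stereo else 1)   # one slice per eye being fed
    return slices * per_viewer_hz

# One stereo viewer at 50 Hz per eye needs a 100 Hz panel - matching the
# 100/200 Hz active 3D sets mentioned above. Four independent 2D viewers
# at 50 Hz each would need a 200 Hz panel; five would need 250 Hz.
print(required_panel_hz(1, stereo=True))  # 100
print(required_panel_hz(4))               # 200
print(required_panel_hz(5))               # 250
```

Seen this way, a four- or five-viewer screen is only a factor of two or so beyond what shipping active 3D panels already do, which is why it seems within reach.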
So, one of the fundamental things I’m arguing is that we should want innovation rather than gradual iteration. Much as with the console market, I’m fine with having more processing power and better graphics, but is that all? I want new ways to experience a technology, not just a small improvement. What happens after 8K? 16K? Will it be any better? Will people even be able to tell the difference? There is a demand for new technologies, but I don’t think 4K is it.