Take the pulse of CES 2013 and you’ll find the usual gizmo lineup: smartphones, tablets, cameras, cars and high-end audio products. But this year, the show’s opening salvo involved something the industry refers to as “4K ultra-high-definition television.” These are towering TV screens — over 80 inches in a few cases — with four times the pixels of today’s top-end 1080p (1920 x 1080) sets: 3840 x 2160, a higher pixel count than even Apple’s groundbreaking Retina MacBook Pro (2880 x 1800).
(Note that 3840 is technically 160 pixels short of a true 4,000-pixel horizontal count — marketing always rounds up.)
Manufacturers are positioning this sort of pixel brawn to compensate for two things: the diminishing visual returns of today’s resolutions as screen sizes climb, and consumers’ lack of interest in 3D technology. According to NPD DisplaySearch, global TV shipments declined 6% in 2012, and the market’s expected to remain flat through 2013 (to be fair, NPD chalks this up more to “economic uncertainty” and slow declines in TV prices). Companies like Sony, Samsung, Toshiba and LG thus seem to be treating 4K TV as “plan B” in their bid to reinvigorate a stagnant market.
I have zero interest in 3D technology, but I’ve long been mindful of display resolutions and screen sizes. Play games or movies on 60-, 70- or 80-inch screens at today’s idea of “high-definition” and you start to notice the pixels, especially sitting close to the screen. It’s most obvious in games, where even the top engines still render well short of reality-caliber visuals, and three-dimensional objects start to look aliased and blocky on bigger screens.
4K TVs in theory ameliorate this by increasing the horizontal and vertical line counts, giving you the experience at 80 inches, say, that you might have at 1080p on a smaller set. No one actually sits up close to an 80-inch screen, for the same reason no one fights for the front row at a movie theater, but if you had to, the idea with 4K TV is that you’d be far less likely to notice the individual pixels (around eight million of them) than you would at 1080p (around two million) on a screen the same size.
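If you want to put numbers on that, here’s a quick back-of-the-envelope calculation in Python (the 80-inch diagonal and 16:9 aspect ratio are assumptions for illustration; real panels vary slightly):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

for name, w, h in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160)]:
    print(f"{name}: {w * h:,} pixels, {ppi(w, h, 80):.0f} ppi at 80 inches")

# 1080p: 2,073,600 pixels, 28 ppi at 80 inches
# 4K UHD: 8,294,400 pixels, 55 ppi at 80 inches
```

Same screen size, four times the pixels, twice the pixel density in each direction: pixels half as wide are that much harder to pick out.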
But these new 4K sets also carry whopping five-figure price tags: upwards of $20,000 in most cases. Take Samsung’s new S9000 series, with an 85-inch model that’ll reportedly go for about $20,000 (there’s a 110-inch model in the offing, too), or Sony’s 84-incher, which Sony plans to sell for $25,000.
Those are unfathomable numbers for most buyers (the only two things I’ve ever paid more than $20,000 for at once were the family car and my house). It’s supposed to be a consumer electronics show, after all. According to the U.S. Census Bureau, the median U.S. household income between 2007 and 2011 was $52,762, for a household averaging 2.6 people. We’re talking TVs that cost nearly half that.
(According to Forbes, Sony plans to launch 60- and 55-inch models later this year at prices described as “attainable.” What that means is anyone’s guess. I have yet to encounter a company whose idea of “attainable” matched my own.)
But more important than pricing is whether you’d see any benefit from a 4K TV at all. Say you could have one for next to nothing today: what would you watch on that gorgeous 4K set if no one’s offering content at native 4K quality? You can’t watch 4K broadcasts, you can’t buy or rent 4K optical discs, and forget about streaming. In short, all we have are hypotheticals about how upscaled content will look, plus a lot of wishful thinking about a 4K content future that hasn’t arrived yet.
Plug an Xbox 360 or PlayStation 3 into one of these things and I can tell you what the games will look like: bigger, obviously, with all the interpolative pitfalls you’d expect from much lower-res content, since most Xbox 360 and PS3 games output at 720p. You’re talking nine times the pixels going from 720p to 4K. There’s no upscaler in the world that’ll compensate for that sort of disparity.
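The arithmetic is worth spelling out; a few lines of Python (using the standard resolutions, not any particular game’s render target) make the gap concrete:

```python
SOURCE = (1280, 720)   # typical Xbox 360 / PS3 output
TARGET = (3840, 2160)  # 4K UHD panel

scale_x = TARGET[0] / SOURCE[0]                                  # 3.0
scale_y = TARGET[1] / SOURCE[1]                                  # 3.0
pixel_ratio = (TARGET[0] * TARGET[1]) / (SOURCE[0] * SOURCE[1])  # 9.0

print(f"{scale_x:.0f}x per axis, {pixel_ratio:.0f}x the pixels overall")
```

For every pixel the console actually renders, in other words, the TV has to invent eight more.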
Upscaling’s a crapshoot anyway. Scan videophile forums and you’ll encounter endless debates about which televisions or consoles upscale best. The only reliable takeaway is that in every case, you’re talking about compromise.
Speaking of compromise, we’re still living in a world flush with standard-definition hardware. Run a 480p Wii or PlayStation 2 game through a 720p or 1080p native television and the image is going to look imprecise on any model, the edges of objects and text blurred because the source pixels have to be stretched to fit the LCD’s much higher native resolution.
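Part of the problem is that the scale factors involved aren’t whole numbers, so a single source pixel can’t map cleanly onto a block of panel pixels. A short Python sketch shows the ratios (assuming a nominal 16:9 854 x 480 source, a simplification since actual console output varies):

```python
SD = (854, 480)  # nominal 16:9 480p source

# Non-integer ratios force the scaler to interpolate between source
# pixels, which is where the soft, blurry edges come from.
for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    print(f"480p -> {name}: {w / SD[0]:.2f}x wide, {h / SD[1]:.2f}x tall")

# 480p -> 720p: 1.50x wide, 1.50x tall
# 480p -> 1080p: 2.25x wide, 2.25x tall
```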
I’m picky enough about scaling that I keep a 22-inch 720p native Sony LCD TV in my office. It’s the set I do most of my gaming on, despite the higher-end 37-inch 1080p native LG LCD TV in the living room. Games look notably better on the 22-inch Sony, partly because the viewing area is smaller and partly because most current-gen console games won’t output at more than 720p. When gaming, I prefer a 1:1 ratio between the game’s output resolution and the display’s native resolution.
What about next-gen hardware, possibly arriving later this year? I can’t imagine Microsoft’s next Xbox or Sony’s next PlayStation outputting content natively at 3840 x 2160 pixels. Even the highest-end PCs with the fastest GPUs struggle to hit 30 frames per second with all the bells and whistles enabled at 2560 x 1600. I’d be surprised if the developers who’ve fiddled with next-gen hardware plan to do anything over 1080p. There’s plenty to worry about these days in game design, production-wise, without having to think about boosting your pixel-pushing prowess fourfold.
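For a rough sense of the rendering burden, here’s the raw pixel throughput at 30 frames per second (a simplification: GPU cost doesn’t scale perfectly linearly with pixel count, but it’s a reasonable first approximation):

```python
FPS = 30  # frames per second

# Raw pixels a GPU must fill each second at various resolutions.
for name, w, h in [("1080p", 1920, 1080),
                   ("2560 x 1600", 2560, 1600),
                   ("4K UHD", 3840, 2160)]:
    print(f"{name}: {w * h * FPS / 1e6:.0f} million pixels/sec")

# 1080p: 62 million pixels/sec
# 2560 x 1600: 123 million pixels/sec
# 4K UHD: 249 million pixels/sec
```

If the fastest PCs already strain at 2560 x 1600, 4K asks them to push roughly twice as many pixels again.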
Sure, games running natively at 4K on 80-inch-plus screens might look stunning — incredibly detailed and ginormous. But we know that’s not going to happen for years, and even if 4K video content crystallizes in the meantime, we’re probably at least a console generation away from serious 4K game hardware.
MORE: Check out TIME Tech’s complete coverage of the 2013 Consumer Electronics Show