The display format known as 4K is moving into the consumer space. This month the Consumer Electronics Association announced an effort to rebrand 4K, the somewhat geeky label that refers to a horizontal resolution of approximately 4,000 pixels, as Ultra High Definition (Ultra HD). Ahead of CES 2013 next January, several TV manufacturers are beginning to move products into this space.
Last week, LG Electronics began retail sales of an 84-inch Ultra HD flat panel in Los Angeles. This 7-foot (2.1 meter) wide screen, selling for $20,000, reportedly will compete against a similar TV from Sony, available for pre-order at $25,000. Given the high-end pricing, it is safe to say that 4K, or Ultra HD, remains an expensive novelty, nowhere near the mainstream. But in other respects, consumers are already viewing 4K.
In the cinematic world, this technology has been around for years. Sony began releasing movies in 4K Digital Cinema Package (DCP) format in 2004, and AMC Entertainment announced plans in 2009 to install 4K projection systems in all of its U.S. theatres. Sony, which will have released more than 50 4K movies by 2013, has promoted 4K over legacy 2K as a way to keep cinema ahead of home theatre competition. “Simply stated,” a 2010 Sony white paper argued, “2K is insufficient to position the cinema as clearly superior to HDTV.”
A moviegoer may be unaware of a film’s particular format, but what is gaining currency is the notion that there is something out there better than HD. In May 2012 cinematographer and director Tom Lowe released what he claimed was the first movie (“TimeScapes”) sold to the public as a 4K file. (See discussion at Gizmodo.) That it weighs in at 160GB and costs $300 again places 4K outside the reach of most consumers, but Pay TV operators and their technology partners are taking note all the same.
In a session devoted to encoding at the recent Cable-Tec Expo in Orlando, ARRIS VP of Engineering Santhana Chari examined whether 4K would bring video quality improvements to a typical home viewing scenario. Using a standard visual acuity limit, Chari plotted optimal viewing distances for given display heights and video resolutions.
What Chari’s diagram indicates is that 4K’s value is tied to shorter viewing distances and larger screens. For a display with a 40-inch height, which roughly pairs with a 75-inch diagonal, 4K is an improvement over 1080i HD at distances of less than 11 feet; at greater distances, 1080i wins. As display heights climb much above 60 inches, 4K owns the field. “The takeaway is that for 4K to make sense, the ratio of the screen height to viewing distance has to be higher,” Chari said. “People have to have bigger displays.”
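The geometry behind Chari’s plot can be sketched directly. Assuming the standard acuity limit of roughly one arcminute per resolvable line (an assumption of this sketch, consistent with the “standard visual acuity limit” cited above), the maximum distance at which the eye can still resolve individual lines of a display follows from simple trigonometry:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: standard visual acuity limit

def max_useful_distance(display_height_in, vertical_lines):
    """Distance (inches) beyond which one line of the display subtends
    less than 1 arcminute, so additional resolution is wasted."""
    line_height = display_height_in / vertical_lines
    return line_height / math.tan(ARCMIN)

# 40-inch-tall display (roughly a 75-inch diagonal, per Chari's example)
for lines in (1080, 2160):
    d = max_useful_distance(40, lines)
    print(f"{lines} lines: resolvable out to about {d / 12:.1f} ft")
```

For the 40-inch display height, the 1080-line limit lands at roughly 10.6 feet, which matches the article’s 11-foot crossover: sit closer than that and the extra lines of 4K (2160 vertical) are visible; sit farther away and they are not.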
While Chari said the main driver would be the big screen, it is not the only one. “It’s certainly fair to focus on an iPad kind of display in 18 months,” he said. Enterprise applications, such as video conferencing, represent another potential market.
However 4K—or Ultra HD—plays out, these formats are going to consume considerable bandwidth. (Note: some CE manufacturers are already demonstrating 8K displays.) That raises the prospect of advanced encoding schemes, such as High Efficiency Video Coding (HEVC). “The main link between [HEVC and 4K] is that when we go to 4K, the bandwidth requirements are going to be 3x to 4x higher compared to 1080, current HD delivery,” Chari said.
“So obviously it makes sense to use HEVC, because HEVC is giving twice the bandwidth improvement over H.264 and four times the improvement compared to MPEG-2,” he said.
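Chari’s figures translate into a quick back-of-envelope calculation. The 8 Mbps H.264 HD baseline below is an illustrative assumption, not a number from the article; only the multipliers come from Chari’s remarks:

```python
# Bitrate estimates built from Chari's multipliers.
# HD_H264_MBPS is an assumed illustrative baseline, not from the article.
HD_H264_MBPS = 8.0
UHD_FACTOR = (3, 4)     # 4K needs 3x-4x the bandwidth of 1080 HD
HEVC_VS_H264 = 0.5      # HEVC roughly halves the H.264 bitrate
HEVC_VS_MPEG2 = 0.25    # ...and is roughly 4x better than MPEG-2

lo, hi = (f * HD_H264_MBPS for f in UHD_FACTOR)
print(f"4K with H.264: {lo:.0f}-{hi:.0f} Mbps")
print(f"4K with HEVC:  {lo * HEVC_VS_H264:.0f}-{hi * HEVC_VS_H264:.0f} Mbps")
```

Under these assumptions, 4K over H.264 would land in the 24-32 Mbps range, while HEVC would bring it back down to 12-16 Mbps, close to what operators already budget for HD today, which is the economic case Chari is making.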