What is 4K? Next-generation resolution explained

When it comes out this summer, the 84-inch LG 84LM9600 will be the largest LCD the market has yet seen, and one of the first with 4K resolution.

(Credit: LG)

As if LED and 3D TV weren’t confusing enough, 2012 and beyond will bring an HDTV technology called 4K. It’s being heralded as the next high-def, and manufacturers are already lining up to bring you products.

But just as was the case with 3D, it’s the hardware chicken before the software egg: there’s no consumer 4K content available. Still, if you listen to the industry, it’ll tell you it’s the last resolution you’ll ever need. So what is 4K anyway, and what makes it different from high definition?

Digital resolutions: A primer

The latest in a line of broadcast and media resolutions, 4K is due to replace 1080i/p (1,920×1,080 pixels) as the highest-resolution signal available for movies and, perhaps, television.

Though there are several different standards, “4K” in general refers to a resolution of roughly 4,000 pixels wide and about 2,000 pixels high. That’s twice the width and twice the height of 1080p, or four times as many pixels: the equivalent of a 2×2 grid of four Full HD screens. Currently 4K is a catch-all term for a number of standards that are reasonably close to that resolution, and the TVs we’ll see this year labeled 4K will actually be Quad HD, defined below. But frankly, we think 4K is the catchier name.
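For a rough sense of scale, here’s a quick back-of-the-envelope sketch in Python comparing total pixel counts. The dimensions used for Quad HD and DCI 4K are the commonly cited figures, taken here as working assumptions rather than anything from a single official spec:

```python
# Back-of-the-envelope comparison of total pixel counts for the resolutions
# discussed in this article. The Quad HD and DCI 4K dimensions are the
# commonly cited figures, used here as working assumptions.

RESOLUTIONS = {
    "480p SD":       (720, 480),
    "720p HD":       (1280, 720),
    "1080p Full HD": (1920, 1080),
    "Quad HD":       (3840, 2160),
    "DCI 4K":        (4096, 2160),
}

FULL_HD_PIXELS = 1920 * 1080

for name, (width, height) in RESOLUTIONS.items():
    pixels = width * height
    print(f"{name:<14} {width}x{height} = {pixels:>9,} pixels "
          f"({pixels / FULL_HD_PIXELS:.2f}x Full HD)")
```

Quad HD works out to exactly four times the pixels of Full HD, which is where the 2×2 comparison comes from; DCI’s cinema 4K is slightly wider still.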

Meanwhile, high definition (HD) itself has been with us for about a decade and is used for Blu-ray movies and HD broadcasts. There are three versions of HD: full high definition 1080p (progressive) and 1080i (interlaced), both 1,920×1,080 pixels, and 720p (1,280×720, often called simply “high definition”).

Most television programs and all DVDs are encoded in standard definition (480 lines). Standard definition is the oldest resolution still in use: it began life in analog NTSC broadcasts and carried over to digital ATSC broadcasts, which fully replaced analog in the US in 2009.

Four resolutions compared: standard definition; full high definition; Quad HD; and 4K/2K. (Credit: CNET)

The beginnings of digital cinema

The roots of 4K are in the theater.

When George Lucas was preparing to make his long-promised prequels to the “Star Wars” movies in the late ’90s, he was experimenting with new digital formats as a replacement for film. Film stock is incredibly expensive to produce, transport, and store. If movie houses could simply download a digital movie file and display it on a digital projector, they could save a lot of money. In a time when cinemas are under siege from on-demand cable services and streaming video, cost-cutting helps to keep them competitive.

After shooting “The Phantom Menace” partly in HD, Lucas shot “Attack of the Clones” fully digitally in 1080p. That was great for the eventual Blu-ray release, but the boffins soon found that 1080p wasn’t a high enough resolution for giant theater screens. If you sit in the front rows of one of these theaters while it’s showing 1080p content, you may see a softer image or the lattice grid of the pixel structure, which can be quite distracting.

The industry needed a standard built on the proposition that viewers may sit as close as one-and-a-half screen heights from the screen, and at that distance 1080p isn’t sharp enough. Digital Cinema Initiatives (DCI) was formed in 2002 with the goal of setting a digital standard. Based on these efforts, two new resolutions came about: a 2K specification (2,048×1,080 pixels) and, in 2005, the 4K format (4,096×2,160 pixels).
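To put a number on that proposition, here’s a rough sketch in Python. The one-and-a-half-screen-heights figure comes from the paragraph above; the rule of thumb that 20/20 vision resolves about one arcminute is our own assumption, not something taken from the DCI documents:

```python
import math

# Rough estimate of how coarse each pixel row looks to a viewer sitting
# 1.5 screen heights from the screen. Assumes the common rule of thumb
# that 20/20 vision resolves roughly 1 arcminute (an assumption here).

def arcmin_per_pixel(vertical_pixels, screen_heights=1.5):
    """Angle subtended by one pixel row, in arcminutes, for a viewer
    sitting the given number of screen heights from the screen."""
    # A screen of height h seen from distance 1.5 * h subtends
    # 2 * atan(0.5 / 1.5) vertically; divide that by the number of rows.
    full_angle_deg = 2 * math.degrees(math.atan(0.5 / screen_heights))
    return full_angle_deg * 60 / vertical_pixels

for label, rows in [("1080p", 1080), ("4K (2,160 rows)", 2160)]:
    print(f"{label}: {arcmin_per_pixel(rows):.2f} arcmin per pixel row")
```

At that distance, 1080p works out to roughly two arcminutes per pixel row, coarse enough for the pixel grid to become visible, while doubling the row count brings it down to about one arcminute.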

The first high-profile 4K cinema release was “Blade Runner: The Final Cut” in 2007, a new cut and print of the 1982 masterpiece. Unfortunately, at that time very few theaters were able to show it in its full resolution. It would take one of director Ridley Scott’s contemporaries to truly drive 4K into your local Cineplex.

The 4K ‘standard’