The Clicker -- Why are there 18 standards for HDTV?

Peter Rojas
Remember how a few weeks ago we said we were looking for new feature columnists? Well, we're happy to introduce The Clicker, a new weekly column from Stephen Speicher about something we all can relate to: television.

Hello and welcome to "The Clicker," Engadget's weekly wandering into the granddaddy of all gadgets – the television.

Have a seat on the couch. Kick back, relax. Just remember the rules: keep your shoes off the coffee table and keep your grubby little hands off my remote. Hold on; that sounded a bit hostile. You'll have to forgive me. I get a little possessive of my remote. Don't worry – I am currently seeing a therapist about this. Well… this and my fits of blind rage upon discovering that CSI has been deleted from the TiVo in order to make room for Emeril Lagasse!

Besides, I'm sure that you, as Engadget readers, can understand the Zen-like relationship that a man has with both his remote control and his TiVo. If you're anything like me, you've learned to scan through the channels with the best of them. You too have developed a Jedi-like ability to quickly skip over Lifetime, Oxygen, and The Hallmark Channel while spending a little more time on ESPN and the Sci-Fi Channel. Basically, your hand feels naked without the remote.

But I digress.

Having spent some time in the cable TV world, I'm often asked questions about television, HDTV, cable, etc. "The Clicker" is my weekly opportunity to answer these questions. If you too have a question, email me at

This week's question comes from Bill Jordon. He asks, "Why are there 18 different standards for HDTV? Why do some stations (like ESPN-HD) use 720p and other stations (like HBOHD) use 1080i?"

First, to be technical, there are eighteen different "ATSC DTV (digital television) formats," of which only six are actually high-definition. Of the six HDTV formats, only two are used frequently: 720p and 1080i. That doesn't really answer the question, but it narrows the field down a bit and gives us something to work with.

The answer comes down to what type of content the broadcaster is looking to optimize. We all know that 1080i has the higher resolution, so why bother offering another format like 720p? While it's true that 1080i has a greater number of pixels (1920 x 1080 vs. 1280 x 720), 720p has two things working to its advantage. First, 720p is a progressive signal. Second, 720p is 60 fps (frames per second). 1080i, on the other hand, is interlaced and 30 fps (60 fields per second).
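The trade-off above is easy to see with some back-of-the-envelope arithmetic. A quick sketch (illustrative raw pixel counts only; actual broadcast bandwidth depends on MPEG compression, not raw pixels):

```python
# Compare the two common HDTV formats by raw pixel throughput.
formats = {
    "720p":  {"width": 1280, "height": 720,  "full_frames_per_sec": 60},
    "1080i": {"width": 1920, "height": 1080, "full_frames_per_sec": 30},
}

for name, f in formats.items():
    pixels_per_frame = f["width"] * f["height"]
    pixels_per_sec = pixels_per_frame * f["full_frames_per_sec"]
    print(f"{name}: {pixels_per_frame:,} pixels/frame at "
          f"{f['full_frames_per_sec']} full frames/sec "
          f"= {pixels_per_sec:,} pixels/sec")
```

So 1080i wins on detail per frame (about 2.07 million pixels vs. 0.92 million), while 720p delivers twice as many complete frames every second.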

Where does this matter? It matters for fast movement (e.g. sports). Let's look at an example using both 720p and 1080i:

Suppose that we have a tennis ball moving across the screen for 1 second. A broadcast in 720p is going to show 60 complete images of the tennis ball. Think of it like an old-fashioned flipbook that has 60 pages. Each page will have a complete image and when you quickly flip through the entire book it will give you movement. This is much like how traditional film works (albeit with 24 fps).

If we were to do the same experiment with 1080i, it would be quite different. Unlike progressive formats, which show the whole picture, interlaced material relies on the fact that two half-pictures will generally combine to make one whole picture. As such, 1080i will display the even lines for 1/60th of a second, followed by the odd lines for 1/60th of a second. If we return to the flipbook example, we can see that the book will still have 60 pages, but each page will look a little like we're looking through mini-blinds. Of course, when it's sped up it doesn't look like this: the display's afterglow and the mind combine to complete the picture.

"But can't you de-interlace a 1080i signal and have the best of both worlds?" (De-interlacing is the process of converting an interlaced signal into a progressive signal by combining the even lines and the odd lines to form one solid picture.) Yes, but even in the best case you are only getting 30 fps (half the frames of 720p). In the worst case, the even lines and the odd lines don't quite match up. Remember that the camera is capturing half the picture every 1/60th of a second. It's entirely possible that the ball has moved enough in that short amount of time that the odd lines no longer align with the even lines.
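That misalignment (the jagged "combing" artifact) is easy to sketch. A toy example, assuming the simplest de-interlacing method, often called "weave" (even rows from one field's capture time, odd rows from the next), with a ball that moves between the two captures:

```python
# Toy sketch of weave de-interlacing on fast motion.
# Each frame is a grid of characters; 'X' marks a 4-pixel-wide ball.

def make_frame(ball_x, width=12, height=4):
    """Build a frame with the ball starting at column ball_x."""
    row = ['.'] * width
    for i in range(ball_x, ball_x + 4):
        row[i] = 'X'
    return [row[:] for _ in range(height)]

frame_even = make_frame(2)  # scene when the even lines are captured
frame_odd = make_frame(4)   # 1/60 s later, when the odd lines are captured

# Weave: even rows from the first capture, odd rows from the second.
woven = [frame_even[y] if y % 2 == 0 else frame_odd[y]
         for y in range(4)]

for row in woven:
    print(''.join(row))
# The ball's edges zigzag every other line instead of staying straight.
```

With a static scene the two fields would line up perfectly; it is only motion between the two 1/60th-second captures that produces the comb.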

In either case, 720p has the potential to deliver a smoother, more stable picture when dealing with fast motion.

It's easy to see why ESPN-HD chose 720p as its standard. Likewise, as ABC and ESPN are owned by the same parent company, one can see why ABC wanted to be in line with its sister network.

"So, if 720p delivers a smoother picture, why not just use it?" The answer is resolution. 1080i has many more pixels, and often you don't need the extra frames. The majority of the content being shown today was first shot on film. Most TV dramas and nearly all movies start their lives as film. Since film is 24 fps, 1080i's 30 fps is more than enough to capture all the frames. The result is that broadcasters who feel that film-based material is their bread and butter will often choose 1080i.

There are, of course, other arguments for each of the technologies (e.g. bandwidth, remote cameras, matching digital TVs, etc.), and, like any good format war, proponents on each side view their side as the only side.

Until next week, save my seat!
