
The (in)famous 1080p truth

This article is making the rounds on the internet, so we may as well throw in our two cents. I think there is a lot of confusion and misinformation about HDTV in general and 1080p specifically, and this piece in particular has a lot of both swirling around it.

First, let's address the section I've seen quoted most often in forums (usually misinterpreted or presented alone with no context). "How about Blu-ray and HD-DVD? If either format is used to store and play back live HD content, it will have to be 1920x1080i (interlaced again) to be compatible with the bulk of consumer TVs. And any progressive-scan content will also have to be interlaced for viewing on the majority of HDTV sets."

What does that mean? If you have a player that outputs only in 1080i (like the HD-DVD player Ben reviewed) and/or a 1080i TV, you will see... you guessed it, 1080i. It's not incredibly complicated or shocking, just something I've seen people go into a frenzy over, suddenly believing 1080p is as real as the boogeyman, the Easter Bunny, or gas that costs less than $2.75 per gallon. Trust me, 1080p is real, but you have to be careful about what you're getting.
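To make the interlaced-versus-progressive point concrete, here's a rough sketch in Python with NumPy. It is not anything a real player or TV runs; it just shows how a single progressive 1080p frame splits into the two 540-line fields a 1080i connection carries, and how a simple "weave" deinterlacer stitches them back together on the display side:

```python
import numpy as np

# A progressive 1080p frame: 1080 rows x 1920 columns (single channel, for brevity).
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

def to_fields(progressive_frame):
    """Split one progressive frame into two interlaced fields (even/odd lines).
    This is roughly what happens when progressive material travels over a 1080i link."""
    top_field = progressive_frame[0::2, :]     # even-numbered lines, 540 rows
    bottom_field = progressive_frame[1::2, :]  # odd-numbered lines, 540 rows
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """'Weave' deinterlacing: interleave the two fields back into one full frame.
    Lossless when both fields come from the same instant (film-sourced content);
    produces combing artifacts when there is motion between the fields."""
    out = np.empty((top_field.shape[0] * 2, top_field.shape[1]), dtype=top_field.dtype)
    out[0::2, :] = top_field
    out[1::2, :] = bottom_field
    return out

top, bottom = to_fields(frame)
rebuilt = weave(top, bottom)
assert np.array_equal(rebuilt, frame)  # lossless here, since both fields share one source frame
```

The takeaway: when both fields come from the same original frame, weaving them back together recovers the full picture, which is why a good deinterlacer can get essentially the whole 1080p image out of a 1080i signal carrying film-based content. It's live, motion-heavy video where deinterlacing gets hard.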

As it stands, this piece is at best half-done, definitely outdated, and somewhat inaccurate.


It tells us that there is not, and likely never will be, a 1080p broadcast standard. That's fine. It tells us that even movies encoded on Blu-ray or HD-DVD discs at 1080p will be played at no better than 1080i on most commercially available televisions. Can't argue with that. Afterwards we get a long diatribe about the whys and wherefores of many "1080p" TVs not accepting a 1080p input. We hear you, buddy.

It never once mentions what a 1080p television can do with a true 1080p source, or that if your television has a good scaler, it may be able to take a less-than-1080p source and display it on a large screen better than a native 720p TV would, thanks to the greater pixel density. If you read carefully, you can tell from the last line that this is really a warning about some so-called 1080p televisions, not the avoid-like-the-plague distress signal it has at times been made out to be. Even TVs that don't accept the highest resolution as their input can still provide an excellent picture, like the SXRD Matt reviewed.
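For a rough sense of what "greater pixel density" buys you, here's a quick back-of-the-envelope comparison. The 50-inch screen size is just an example I picked, not a figure from the article:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a 16:9 panel of the given diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# Example: the same 50" screen size, two native panel resolutions.
for label, (w, h) in {"720p (1280x720)": (1280, 720), "1080p (1920x1080)": (1920, 1080)}.items():
    print(f'{label} at 50": {ppi(w, h, 50):.0f} pixels per inch')

ratio = (1920 * 1080) / (1280 * 720)
print(f"A 1080p panel has {ratio:.2f}x as many pixels as a 720p panel")  # 2.25x
```

Same physical screen, roughly 2.25 times the pixels: that extra density is what gives a well-scaled sub-1080p source more room to look good on a big 1080p display.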

What does all this mean? A 1080p television or projector might be right for you, but check what's really inside it first and what it is doing to your signal. Nearly everything you watch will need to be upconverted (at least until you get a Blu-ray player, or until HD-DVD players add support for output at that resolution), so take a look at the TV while it is displaying the kind of source and content you'll likely be watching most, and see how it handles it. This isn't Blu-ray/HD-DVD/PS3/Xbox 360 fanboy fodder; for the most part, people agree that high definition content looks better on the 1080p TVs that are available.
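Since almost everything ends up being scaled to the panel's native resolution anyway, here's a deliberately naive nearest-neighbor upconversion sketch, just to show what "upconverted" means in practice. Real TV scalers use far more sophisticated filtering and edge handling; none of this reflects any particular set's internals:

```python
import numpy as np

def upscale_nearest(frame, target_h, target_w):
    """Naive nearest-neighbor upconversion: each target pixel is copied from the
    closest source pixel. Real scalers do much better (multi-tap filters, edge
    adaptation), but the basic job, resampling to panel resolution, is the same."""
    src_h, src_w = frame.shape[:2]
    row_idx = np.arange(target_h) * src_h // target_h
    col_idx = np.arange(target_w) * src_w // target_w
    return frame[row_idx[:, None], col_idx]

# e.g. a 720p broadcast frame resampled to fill a 1080p panel
source = np.random.randint(0, 256, size=(720, 1280), dtype=np.uint8)
on_panel = upscale_nearest(source, 1080, 1920)
print(on_panel.shape)  # (1080, 1920)
```

How gracefully a set does this step, on the inputs you actually watch, is exactly what you want to eyeball in the store.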

It's just like any new technology: look very carefully before jumping into the first generation; otherwise you may be better off saving a bit of money and going with older, more established technology. However, if you're willing to do the research and maybe spend a little extra, you can enjoy the benefits of the most cutting-edge technology without having to make compromises.

Sources for more information:

AVSForum opinions
Sound and Vision Magazine on the challenges of deinterlacing
Interesting discussion on bob-and-weave deinterlacing

[Thanks to all who submitted this]