"While DirecTV has the most HD channels right now, is there really a noticeable difference in the quality of their HD vs. the HD I would get from my local cable provider? I've read a lot about "HD Lite" and how the satellite providers have to compress the data – but is there really that big of a difference visually? Am I not getting a real 1080i/720p picture?"
Let's get something straight right out of the box: all HD is compressed -- even HD DVD and Blu-ray -- and the answer to "how much is it compressed?" doesn't tell the whole story. What really matters is whether you can tell it's compressed, and the most obvious signs are those blocks (sometimes referred to as macroblocking or pixelation) made famous by internet video sites such as YouTube. With that out of the way, we'll cut to the chase before getting into the nitty-gritty. The answer isn't black and white, and no matter what anyone tells you, no one can really claim the best HD quality. Ultimately the only thing that matters is what looks good to you, but we're not going to let that stop us from analyzing the data.
National HD Channels
Yes, Chris, there can be a noticeable difference in quality between providers, but not always. As much as they don't want to admit it, they're all just bit providers carrying content from the networks to our homes, and because everyone gets ESPN HD from the same place -- for example -- it can only look so good. So even on a provider like Verizon FiOS, which prides itself on passing on the original signal, it can still look bad. It has been confirmed that FiOS passes the signal untouched, but at the same time it has also been observed that HD feeds such as National Geographic HD have seen reductions in bit rates recently. Although no one knows the reason for sure, the theory is that it is at the request of the cable companies. This sounds great to big cable, because not only does it save them a few bits, but it also ensures that competing services won't have a better-quality feed. Unfortunately, passing the signal on untouched is not the norm; in fact, many providers choose to compress the signal even further in order to raise the efficiency of their available throughput.
Because ATSC provides up to 19.3 Mbps to your home, combined with the super high bit rate feeds from the networks (CBS's feed, for example, is over 30 Mbps), local broadcasts have always provided some of the best quality HD around -- before HD DVD and Blu-ray, anyway. But unfortunately, thanks to multicasting, they are also the source of some of the worst HD compression artifacts you've ever seen. What happens is that a local station makes the decision -- usually business related -- to broadcast more than one channel in its allotted space. This effectively splits the 19.3 Mbps channel into multiple channels, leaves far fewer bits for the HD feed, and thus can produce macroblocking like you've never seen before. The sad part is that most of the time 1080i uses nowhere near all that throughput, and if the latest statistical multiplexing technology were utilized and priority were given to the main feed, we wouldn't even be able to tell. So until all the local station engineers figure out how to use the new encoders, some of us are stuck watching football with squares and dancing grass. There are some who will argue that the real reason some networks chose 720p over 1080i had nothing to do with frame rate and everything to do with the fact that 720p has significantly lower throughput requirements. We love 1080 as much as the next person, but we'll gladly take 720p if that's what it takes to ensure there are no compression artifacts. Finally, your provider receives your local broadcaster's signal either via a dedicated uplink or an OTA antenna -- crazy, right? Some providers simply modulate the signal onto the QAM network, while others choose to re-compress it on the fly to save a few bits beforehand. Regardless of how much the bandwidth is reduced, re-encoding introduces another opportunity for the signal to be mucked up.
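To put some rough numbers on the multicast squeeze, here's a minimal back-of-the-envelope sketch in Python. The subchannel bit rates are assumptions for illustration, not measured values from any real station:

```python
# Rough arithmetic behind the multicast squeeze and the 720p/1080i
# pixel-rate gap. All figures are illustrative, not measured.

ATSC_BUDGET_MBPS = 19.3  # approximate MPEG-2 payload of one ATSC channel

def hd_bitrate_after_multicast(subchannel_mbps):
    """Bits per second left for the main HD feed after SD subchannels take their cut."""
    return ATSC_BUDGET_MBPS - sum(subchannel_mbps)

# A hypothetical station adding two SD subchannels at ~4 Mbps each:
print(hd_bitrate_after_multicast([4.0, 4.0]))  # ~11.3 Mbps left for the HD feed

# Raw (uncompressed) pixel rates per second, before any codec is applied:
def pixel_rate(width, height, frames_per_sec):
    return width * height * frames_per_sec

p720 = pixel_rate(1280, 720, 60)    # 720p: 60 full progressive frames/s
i1080 = pixel_rate(1920, 1080, 30)  # 1080i: 30 full frames/s (60 fields)
print(i1080 / p720)  # 1080i30 carries 1.125x the raw pixels of 720p60
```

The raw pixel counts are closer than you might expect -- 1080i30 only pushes about 12% more pixels per second than 720p60 -- but interlaced material is generally harder for MPEG-2 encoders to compress cleanly, which is part of why the throughput argument for 720p gets made.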
MPEG-4 isn't all good
It's true that MPEG-4 is a much more efficient codec than MPEG-2, but because every local broadcaster and just about every national channel transmits in the older codec, we live in an MPEG-2 world. This means that when a provider makes the switch to MPEG-4, it has to buy encoders that will re-encode the signal in real time. For the most part this doesn't cause any noticeable problems, but any compression artifacts in the source feed can be exaggerated or prolonged, and the option of passing on an untouched feed is completely out -- again, a chance to muck it up. This doesn't necessarily mean the signal will look worse than the MPEG-2 source, but it most certainly can't look better. Some networks, like HBO, have announced plans to deliver their feeds to providers in MPEG-4, but it'll be a long time before this is the norm.
So as much as it pains us to say it, it really depends on your market. It depends on how good your locals are and how much of a pinch your provider is in for bandwidth. As the analog channels go away, new satellites are launched, and new technologies like SDV and MPEG-4 become more widespread, the quality should get better. But as long as most of America prefers quantity over quality, and the complaints stay low enough to make business sense, we'll continue to see some ugly HD. As for which is the best, FiOS is the only provider we'd feel comfortable saying is any better than the rest, only because it doesn't re-encode the original signal to save bandwidth; but with its pitiful list of HD channels, it's hardly the HD leader -- not to mention only available in a few select parts of the country.
Got a burning question that you'd love to toss out for Engadget HD (or its readers) to take a look at? Tired of Google's blank stares when you ask for real-world experiences? Hit us up at ask at engadgethd dawt com and keep an eye on this space -- your inquiry could be next.