John, this is easily one of the most common questions we get. We've covered it before, and honestly, we don't mind covering it again. This time around, though, we'll break it down nice and easy, not only for you but for the hundreds of readers with the same question: how can you tell the difference between 1080i and 1080p? You probably can't.

We're serious. Both carry the same amount of picture data, but the 'i' stands for interlaced: an interlaced signal draws the odd-numbered horizontal lines first and then the even-numbered ones, each field in 1/60 of a second, for a complete frame every 1/30 of a second. The 'p' stands for progressive, which draws every line of the frame at once. So is there a difference? Yes. Can you see it? Probably not anymore.
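If you like to see things spelled out, here's a rough sketch of that idea in Python. The function names and the line-numbering scheme are our own illustration, not anything from a video standard, but it shows why 1080i and 1080p carry the same data: interlacing just splits one frame into an odd field and an even field.

```python
# Illustrative sketch (our own naming): an interlaced signal splits one
# 1080-line frame into two half-height fields, odd lines then even lines.

def split_into_fields(frame):
    """Split a progressive frame (a list of scan lines) into two fields."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (list indices 0, 2, 4, ...)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Reassemble both fields into a full frame ('weave' deinterlacing)."""
    frame = [None] * (len(odd_field) + len(even_field))
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

progressive_frame = [f"line {n}" for n in range(1, 1081)]  # one 1080p frame
odd, even = split_into_fields(progressive_frame)
assert len(odd) == len(even) == 540            # each field holds half the lines
assert weave(odd, even) == progressive_frame   # same total data either way
```

Split the frame, weave it back together, and you get exactly the lines you started with, which is why the data counts match.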

The main reason you can't tell comes from the types of HDTVs on the market these days. LCDs, DLPs, SXRD/DILA/LCoS sets, and plasmas are all inherently progressive displays; only CRTs can properly display an interlaced signal. The others take the incoming interlaced signal and convert it to a progressive resolution like 720p or 1080p. There is, of course, a good amount of technical stuff behind that (see Deinterlacing and telecine), but we promised John and everyone else that we'd keep it nice and easy.
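For the curious, here's one of the simplest deinterlacing tricks, usually called "bob": the display line-doubles each half-height field into a full-height frame. This is just a toy sketch of the general idea, not the actual algorithm inside any particular TV, and the names are ours.

```python
# Toy sketch of "bob" deinterlacing (our own naming): each half-height field
# is line-doubled into a full-height progressive frame.

def bob_deinterlace(field):
    """Line-double one field (half-height) into a full-height frame."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # repeat each line to fill in the missing rows
    return frame

field = [f"line {n}" for n in range(1, 541)]  # one 540-line field from 1080i
frame = bob_deinterlace(field)
assert len(frame) == 1080  # full progressive frame height
```

Real sets use fancier motion-adaptive methods than this, but the end result is the same: whatever comes in, what hits the panel is progressive.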

[Thanks for the question John!]