Quick background:

Back in the bad old days, your analog cathode-ray-tube TV would sweep left to right, top to bottom, painting the image on the screen. They wanted 60 frames per second, but they didn't have enough bandwidth, so they did 60 "fields" per second, where one field was the even lines and the other was the odd lines (Europe was 50 fields per second...). There was (barely) enough persistence in the phosphors on the screen that the even lines would still be giving off light while the adjacent odd lines were being painted, and there was enough fuzziness in the electron beam that it didn't really matter.
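
To make the even/odd split concrete, here's a minimal Python/NumPy sketch (the frame size and pixel data are just placeholders) of how a frame divides into two fields and how a simple "weave" deinterlacer stitches them back together. The weave is only lossless if nothing moved between the two fields.

```python
import numpy as np

# Hypothetical standard-def frame; real broadcast sizes and content vary.
HEIGHT, WIDTH = 480, 640
frame = np.random.randint(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)

# One field carries the even-numbered scan lines, the other the odd ones.
even_field = frame[0::2, :]   # 240 lines
odd_field = frame[1::2, :]    # 240 lines

# "Weave" deinterlacing: interleave the two fields back into a full frame.
woven = np.empty_like(frame)
woven[0::2, :] = even_field
woven[1::2, :] = odd_field

# Perfect reconstruction only because both fields came from the same instant;
# with motion between fields you'd see comb artifacts instead.
assert np.array_equal(woven, frame)
```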

Fast forward to HDTV. The HDTV boffins defined two different HD "standards": 720p and 1080i. 1080i has more than 2x the vertical scan lines of standard def, and way more pixels horizontally, but it's still interlaced. This was because it was cheaper to hack a CRT-based TV to do 60 fields per second of 540 lines than to do 60 full frames per second of 720 lines, while for plasma, LCD, etc., it was cheaper to do progressive scan at a lower resolution. Thus, the 1080i vs. 720p dichotomy was born. Nobody expected CRTs to just flat-out go away as quickly as they did; otherwise, 1080i would never have happened.
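
A back-of-the-envelope comparison shows the trade-off. This sketch uses the nominal digital resolutions (480i taken as 720x480) and ignores blanking and overhead, so treat the numbers as rough:

```python
# width, height, full frames per second (interlaced formats deliver
# 60 fields/s, which is 30 complete frames/s).
formats = {
    "480i": (720, 480, 30),
    "720p": (1280, 720, 60),
    "1080i": (1920, 1080, 30),
    "1080p/60": (1920, 1080, 60),
}

for name, (w, h, fps) in formats.items():
    print(f"{name:>9}: {w * h * fps / 1e6:6.1f} megapixels/second")
```

720p and 1080i land in roughly the same ballpark of pixels per second; 1080i spends that budget on spatial detail, 720p on full frames every 1/60th of a second.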

1080p, of course, has its own variants. There's 1080p/24, which is 24 frames per second to match film -- movie mode. Then there's 1080p/60, which is pretty much the ultimate.