Television flickers horribly. The reason it does not *appear* to flicker is that 1) elements on screen are typically in motion, and 2) television is usually presented field by field.

On point #1, you can verify this by running your computer output to your TV. I remember that on my old Amiga, running 640x400 interlaced out to the TV would cause all sorts of flickering problems. You can verify this from a PC as well: static objects *will* flicker.

Point #2 -- the first and second passes of the interlace are typically treated as two separate "frames" (known as fields in the industry). A full frame, both fields together, takes 1/30 of a second to draw, but each individual field takes only about 1/60 of a second. So rather than "wasting" that cycle and allowing possible phosphor fade, the next frame's content goes into the second interlace pass (the second field). So basically, you lose HALF the vertical resolution and gain twice the refresh. This is a very common practice, btw.
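To make the timing concrete, here's a minimal sketch of the field arithmetic, assuming NTSC-style numbers (60 fields/sec, 480 visible lines). The names and constants are just illustrative, not any real video API:

FIELD_RATE_HZ = 60                    # one field (half the lines) every 1/60 s
FRAME_RATE_HZ = FIELD_RATE_HZ / 2     # a full frame (both fields) every 1/30 s
TOTAL_LINES = 480                     # visible scanlines in a full frame

def field_lines(field_index):
    # Even-numbered fields paint the even scanlines, odd fields the odd
    # ones -- so each field carries only half the vertical resolution.
    start = field_index % 2
    return list(range(start, TOTAL_LINES, 2))

# Each 1/60 s pass draws *new* picture content into alternating halves
# of the screen instead of letting half the phosphor fade for 1/30 s.
for f in range(4):
    lines = field_lines(f)
    print("field %d: t=%.4fs, %d lines, starts at line %d"
          % (f, f / FIELD_RATE_HZ, len(lines), lines[0]))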

There are TVs out there with high phosphor persistence and only ~200 lines of resolution. These are the cheap TVs, and they are actually adequate for much of the content out there...

Keep in mind, though, that this doesn't apply to DVDs. DVDs output many more vertical lines of resolution than broadcast, and are typically full frames, not field based.

The more expensive TVs have low-persistence phosphor to keep pixels from hanging around too long and fuzzing things up. Manufacturers can also increase the scan frequency using electronic magic, implement line doublers, progressive scanning, etc., to create miraculous improvements over straight interlacing.
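For a feel of what a line doubler is doing, here's a hedged sketch of the simplest approach, "bob" deinterlacing, which stretches a single 240-line field back to full height by interpolating between neighboring field lines. Real doublers in these sets are hardware and much smarter; this only shows the principle:

def bob_double(field):
    # Expand one field (a list of scanline values) to double height by
    # keeping each field line and synthesizing the line between
    # neighbors as their average. The display can then refresh every
    # line on each 1/60 s pass instead of alternating halves.
    doubled = []
    for i, line in enumerate(field):
        doubled.append(line)
        nxt = field[i + 1] if i + 1 < len(field) else line
        doubled.append((line + nxt) / 2.0)
    return doubled

field = [0, 10, 20, 30]      # toy "scanlines" as brightness values
print(bob_double(field))     # [0, 5.0, 10, 15.0, 20, 25.0, 30, 30.0]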