Tony probably knows more about this than I do, but I'll bet that some HDTV tuners have this upsampling built into them.
Correct. In fact, I think pretty much all of them do. So you don't necessarily need this on your set. But if the set has a better upsampler than the HDTV tuner, then you can put the tuner into what's called "native" mode (where it just sends out whatever it gets in, verbatim, without doing up- or down-sampling) and let the set do the scan-rate conversion.
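
If it helps to picture the difference, here's a rough Python-flavored sketch of native mode versus a forced output mode. The mode names and the tuner_output function are made up for illustration, not any real tuner's firmware:

def tuner_output(broadcast_format, tuner_mode):
    # Native mode: pass whatever comes in straight through (480i stays
    # 480i, 720p stays 720p) and let the TV do any conversion.
    if tuner_mode == "native":
        return broadcast_format
    # Forced mode: the tuner rescales everything to one fixed output
    # format (e.g. "1080i"), whether that means up- or down-sampling.
    return tuner_mode

print(tuner_output("720p", "native"))  # -> 720p, the TV converts if it has to
print(tuner_output("720p", "1080i"))   # -> 1080i, the tuner does the converting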

Since 720p and 1080i draw their images in completely different ways, there are only two ways to display both on the same screen:

1. Leave the screen in the same mode and digitally mangle the image so it fits the opposite screen format. (There's a toy example of this just below the list.)

2. Change the scan rate of the CRT raster to fit the image type.
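
The first option is just image scaling. Here's a toy sketch of the idea in Python, with made-up names and the crudest possible nearest-line scaling; real scalers use much fancier filtering (and have to deal with de-interlacing), but the principle is the same:

def rescale_lines(frame, out_lines=1080):
    # Map each output scan line back to the nearest source line.
    in_lines = len(frame)
    return [frame[(y * in_lines) // out_lines] for y in range(out_lines)]

frame_720p = [[0] * 1280 for _ in range(720)]  # dummy 1280x720 frame
frame_1080 = rescale_lines(frame_720p)
print(len(frame_1080))  # 1080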

Obviously, the second option is better, but it's only doable on certain expensive CRTs, and not possible at all on plasma/DLP/LCD screens.

My rear-projection CRT TV can only display 1080i or 540p; it can't do 720p. It takes an expensive CRT to do 720p. (For those in the know, 1080i is just as "hard" to do, raster-electronics-wise, as 540p. To do 720p, the raster has to hit a higher horizontal frequency.)
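
To put rough numbers on "higher frequency", here's the arithmetic using the nominal total-line counts from the ATSC/SMPTE specs (the figures are approximate, and I'm treating 540p as one 1080i field drawn progressively, which is how these sets typically do it):

formats = {
    "480i":  (525,   30),  # NTSC: 525 total lines, ~30 interlaced frames/sec
    "1080i": (1125,  30),  # 1125 total lines, 30 interlaced frames/sec
    "540p":  (562.5, 60),  # roughly one 1080i field, drawn 60 times/sec
    "720p":  (750,   60),  # 750 total lines, 60 progressive frames/sec
}

for name, (total_lines, frame_rate) in formats.items():
    khz = total_lines * frame_rate / 1000.0
    print(f"{name}: ~{khz:.2f} kHz horizontal scan rate")

# 480i lands around 15.7 kHz, 1080i and 540p both land around 33.75 kHz
# (which is why a set that can do one can do the other), and 720p needs
# about 45 kHz -- a faster, more expensive raster.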

But my TV also doesn't have any scan-rate conversion or upsampling built in, so I can't use my HDTV decoder in "native" mode. I have to set it to forced-1080i mode, and its upsampling SUCKS: blurry and washed out. So any signal below 1080i (such as 720p broadcasts, or regular 480i TV shows that I choose to route through the HDTV decoder) looks like crap.

That's probably more information than you needed, but what it comes down to is this: depending on your equipment, the TV's 720->1080 conversion might be better than the one in your HDTV decoder, or vice-versa. Try both and pick the one you like better.
_________________________
Tony Fabris