It basically boils down to a simple law of physics: the greater the difference in temperature between inside and out, the faster heat will move between the two.
Therefore, setting the thermostat for a lesser difference in temperature (between inside and out) will slow the transfer (loss or gain) of heat, saving energy by reducing how much the furnace (or AC) has to run to make up for it.
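For anyone who wants to see that rule play out numerically, here's a quick back-of-envelope sketch. It's a lumped model with made-up numbers (the outdoor temperature, heat-loss coefficient `K`, and thermal mass `C` are all assumptions, not measurements): heat leaks out in proportion to the indoor/outdoor difference, and the furnace supplies whatever is needed to hold the thermostat setting.

```python
# Lumped-capacitance sketch of a heated house. All constants are
# illustrative assumptions, not real measurements.

OUTDOOR = 0.0      # outdoor temperature, deg C
COMFORT = 20.0     # normal thermostat setting, deg C
SETBACK = 15.0     # setpoint while away, deg C
K = 0.2            # heat-loss rate, kWh per hour per deg C of difference
C = 10.0           # thermal mass of the house, kWh per deg C
DT = 0.01          # timestep, hours
HOURS_AWAY = 8.0

def furnace_energy(setpoint_while_away):
    """Total furnace energy (kWh) over the away period plus the reheat."""
    temp = COMFORT
    energy = 0.0
    t = 0.0
    # While away: the house coasts downward; the furnace only fires
    # if the temperature drops below the away setpoint.
    while t < HOURS_AWAY:
        loss = K * (temp - OUTDOOR) * DT   # heat leaking out this step
        temp -= loss / C
        if temp < setpoint_while_away:     # furnace tops it back up
            energy += (setpoint_while_away - temp) * C
            temp = setpoint_while_away
        t += DT
    # Back home: reheat to the comfort setting. (Losses during the
    # reheat are ignored; including them only shrinks the gap slightly.)
    energy += (COMFORT - temp) * C
    return energy

steady = furnace_energy(COMFORT)    # never turn it down
setback = furnace_energy(SETBACK)   # turn it down while out
print(f"hold {COMFORT:.0f} C: {steady:.1f} kWh; setback: {setback:.1f} kWh")
```

With these numbers the setback run always comes out lower: while the house is cooler it leaks less heat per hour, and the reheat at the end only has to replace the (smaller) amount that actually leaked out, never more.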
Thanks, but that's not the part of the equation that's puzzling us. We figured it takes less energy to heat a house to a lower temperature.
The part I'm wondering about is the decision to lower the temperature while out, meaning that when you get back the furnace is going to have to run for a longer time to heat the house back up to a comfortable level than it does to simply maintain a temperature. The question is: does the energy needed to bring the temperature back up offset the energy saved by not having to heat the house to as great a degree, or would it be about the same to simply leave the heat on at the same temperature?
I'm assuming the answer is "no," so I continue to turn the heat down when we leave, but I'd like to wrap my head around a concrete explanation of why.