I'm used to thinking "the more watts the better", but people have told me I have to think about amps too.
Perhaps I am missing something here, but since watts = volts * amps, and the output of a power supply is always a fixed number of volts (5 or 12, I believe, depending on what it is powering), then more watts does = more amps.
The point is that, while a PSU may provide (say) 450W altogether, it's still not much use if its 12V output is rated at 425W and its other outputs (5V, 3.3V) only add up to 25W. Dignan's advisers were asking him to check whether the amperage on each individual voltage line was sufficient for his system. (AIUI, hard disks and fans draw from the 12V line, general PCI cards and logic from 5V, and CPUs and AGP cards from 3.3V, but I'm prepared to be corrected on that.)
So, for instance, if you have two CPUs at 50W each, you'll need at least 100W available on the 3.3V line, which is about 30A (100W / 3.3V is roughly 30.3A).
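If it helps, the per-line arithmetic is easy to sanity-check with a few lines of Python. This is just my own throwaway sketch using the example figures from this thread, not anyone's real specs:

    # Minimal sketch: amps a voltage line must supply for a given load.
    # Wattage figures below are only the examples discussed above.
    def amps_needed(watts, volts):
        """Required current (A) = power (W) / line voltage (V)."""
        return watts / volts

    print(amps_needed(100, 3.3))   # two 50W CPUs on the 3.3V line -> ~30.3A
    print(amps_needed(425, 12.0))  # a 425W load on the 12V line  -> ~35.4A

So when comparing PSUs, you'd check the rated amps printed on the label for each line against numbers like these, rather than just the headline wattage.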
Peter