I have been under the impression that speakers need their amps to supply at least their minimum rated power. If the amp can't, you can get severely clipped waveforms and end up blowing the speakers. That impression also led me to believe that you can never have an amp that's too powerful for your speakers: if the speakers don't require that much power, they won't draw that much. If I'm totally off base, someone tell me, but this is what I think I gleaned from past postings on the subject.
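To make the clipping part concrete, here's a tiny sketch of what an underpowered amp does to a waveform. The numbers (a hypothetical amp that can swing only +/-10 V, asked for a signal peaking at 20 V) are made up purely for illustration:

```python
import math

V_LIMIT = 10.0   # maximum voltage the hypothetical amp can swing
V_PEAK = 20.0    # peak voltage the signal actually calls for

def amp_output(t, freq=1000.0):
    """What the amp puts out at time t: the ideal sine, hard-limited
    at the supply rails. The flat tops are the clipping."""
    ideal = V_PEAK * math.sin(2 * math.pi * freq * t)
    return max(-V_LIMIT, min(V_LIMIT, ideal))

# Near the sine's peaks the output sits pinned at +/-10 V, so the
# waveform's tops get squared off -- the "clipped wave" above.
```

The longer the output sits pinned at the rail, the closer the waveform gets to a square wave, which is the failure mode described above.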

But if your speakers and amp both have higher power ratings, then, all other things being equal (which isn't really going to happen IRL), you'll get better fidelity.

The way I've been thinking about this is that you can plug a 40W light bulb into an outlet, but you can also plug a 1400W space heater into the same outlet. In both cases, the power isn't determined by the outlet but by the thing we'd intuitively consider passive. That makes sense, though: it's the resistance of the load that changes, not the voltage from the outlet. With speakers the voltage changes too, which is what makes the sound vary, but if the amp were playing a steady sine wave, the voltage wouldn't vary. Then it reduces to a simpler scenario: you can just think of it as an outlet (the amp) and a light bulb (the speaker). In that case, the wattage becomes another way of stating the current capacity of the circuit. In your house, that limit is the fuse (and the wiring). For the amp, it's telling you that it can't accurately reproduce signals demanding more than that much current. If it tries, it'll fail and produce square waves.
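The outlet analogy works out numerically with nothing but Ohm's law. This sketch assumes 120 V mains (a US outlet) and uses the bulb and heater wattages above to back out each load's effective resistance, then shows that the resistance alone sets the draw:

```python
V_MAINS = 120.0  # assumed RMS voltage of a US outlet

def resistance_from_rating(rated_watts, volts=V_MAINS):
    """Effective resistance implied by a device's power rating: R = V^2 / P."""
    return volts ** 2 / rated_watts

def power_drawn(resistance, volts=V_MAINS):
    """Power a resistive load actually pulls at a given voltage: P = V^2 / R."""
    return volts ** 2 / resistance

bulb_r = resistance_from_rating(40)      # 40 W bulb  -> 360 ohms
heater_r = resistance_from_rating(1400)  # 1400 W heater -> ~10.3 ohms

print(power_drawn(bulb_r))    # -> 40.0
print(power_drawn(heater_r))  # ~1400 W
```

Same outlet, same voltage; the heater simply presents a much lower resistance, so it pulls about 35 times the current and power. The speaker/amp case is the same picture once the voltage is held steady.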

Again, though, I could be way off base.
_________________________
Bitt Faulk