Originally Posted By: wfaulk
The CPU is likely to consume a given amount of power based on usage, and it's easy enough to find out which applications used which amount of CPU time.

Not everything you want to know is necessarily instrumented, though. For instance, one CPU-second of work on a dataset that fits in on-chip cache uses far less power than one CPU-second of work that misses the cache and spends all its time waggling the external pins that head off to DRAM. And that's exactly the sort of thing that a large codebase ported from desktop CPUs, which is presumably what Flash is, won't have been optimised for.
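To make the cache point concrete, here's a rough sketch (my own illustration, not anything from the Flash codebase): the same bytes are summed twice, once with a unit-stride walk that reuses each cache line fully, and once with a large-stride walk that touches a fresh cache line on nearly every access. In a compiled language the second walk is dramatically slower (and, on real hardware, burns more energy driving the memory bus); in pure Python the interpreter overhead masks much of the gap, so treat the timings as illustrative only.

```python
import time

N = 1 << 22       # 4 MiB of bytes, larger than typical L1/L2 caches
STRIDE = 4096     # jump far enough that each access lands on a new cache line

data = bytearray(i & 0x3F for i in range(N))

def sum_sequential(buf):
    """Unit-stride walk: consecutive bytes share cache lines,
    so most accesses hit in on-chip cache."""
    total = 0
    for b in buf:
        total += b
    return total

def sum_strided(buf, stride=STRIDE):
    """Large-stride walk over the same bytes: successive accesses are
    far apart, so far more of them miss the cache and go out to DRAM."""
    total = 0
    for start in range(stride):
        for i in range(start, len(buf), stride):
            total += buf[i]
    return total

t0 = time.perf_counter()
s1 = sum_sequential(data)
t1 = time.perf_counter()
s2 = sum_strided(data)
t2 = time.perf_counter()

# Same bytes, same sum; only the access pattern (and hence the
# cache behaviour) differs between the two walks.
assert s1 == s2
print(f"sequential: sum={s1}  {t1 - t0:.3f}s")
print(f"strided:    sum={s2}  {t2 - t1:.3f}s")
```

The CPU-time accounting the quoted post relies on would charge both walks roughly the same number of CPU seconds, which is exactly why per-process CPU time is a poor proxy for energy drawn.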

Peter