Yeah. Most of the big fixes have been at the OS level, like Android’s Doze and Doze on the Go, or Apple’s integration of QoS classes and timer coalescing into the heart of the system. These basically have the OS step in and force apps to behave better.
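For a sense of what opting in looks like from userland on the Apple side, here’s a minimal Swift sketch (the 60s interval, 10s tolerance, and the work itself are made-up placeholders, not anything from a real app):

```swift
import Foundation

// Timer coalescing: a nonzero tolerance tells the kernel "fire me any
// time within [interval, interval + tolerance]", so it can batch this
// wakeup with others instead of spinning the CPU up just for us.
let refresh = Timer(timeInterval: 60, repeats: true) { _ in
    print("periodic refresh")
}
refresh.tolerance = 10  // Apple's docs suggest ~10% of the interval
RunLoop.main.add(refresh, forMode: .common)

// QoS: tagging work as .utility or .background lets the scheduler
// route it to efficiency cores and defer it under battery pressure.
DispatchQueue.global(qos: .utility).async {
    // throughput-oriented background work goes here
}

RunLoop.main.run()  // keep the process alive so the timer can fire
```

Doze is the blunter version of the same idea: instead of asking apps nicely, the OS batches everyone’s alarms and network access into shared maintenance windows.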

I keep hoping to learn something about power-optimized algorithms and data structures, but power use seems too tied to hardware details to model in any way that would give you a “big-O Joules” on par with the time/space models we use now. (And of course even the time models kinda suck at factoring in multi-level memory hierarchies.)
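To put a finer point on “too tied to details”: the standard first-order CMOS power model (textbook material, not from any particular paper) is

```latex
% First-order CMOS power model: dynamic switching + static leakage.
% \alpha = activity factor, C = switched capacitance,
% V = supply voltage, f = clock frequency.
P \approx \underbrace{\alpha C V^{2} f}_{\text{dynamic}}
        + \underbrace{V I_{\text{leak}}}_{\text{static}},
\qquad E = \int_{0}^{T} P \, dt
```

and every term except maybe α is set by the hardware and its DVFS policy rather than by the algorithm, so an asymptotic Joules bound would throw away exactly the factors that dominate.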
