I think people substantially underestimate how hard the iOS GUI pushes the performance wall, from one release to the next.
There was a talk at WWDC this year (session 419, if you want to watch it) that goes into some detail about how the blur effects on e.g. the lock screen, notification center, control center, toolbars, etc. work.
You can watch the talk for the details, but as a practical matter several of these effects involve seven (!) full GPU passes just for the effect alone, before you count whatever UI is under or over it. And not even the very latest-gen devices can do it consistently full-screen at 60fps. iOS shows motion underneath blur effects only in very small areas (e.g. a toolbar) on new devices, and if you think about it, you never see motion underneath, say, the lock screen. The reason is that not even the latest iPad Air has the GPU to apply some fullscreen blurs fast enough to meet the draw deadline.
This actually surprises a lot of people--I can't count the number of designs I've gotten where a designer has drawn a large blurred UI element and expects something to move underneath it. Depending on the size of the element and the target hardware you can sometimes get away with it, but you're in for quite a lengthy round of performance testing, and a complicated spectrum of fallbacks for all the devices that just aren't fast enough.
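To make the fallback spectrum concrete, here's a minimal sketch of the pattern, assuming iOS 8's UIVisualEffectView. The `blurCapable` flag is a hypothetical stand-in for whatever capability check you end up with (in practice a device-model allowlist or measured frame times), and the specific colors are arbitrary:

```swift
import UIKit

// Sketch: pick between a real blur and a cheap opaque fallback.
// "blurCapable" is a hypothetical input -- in practice you would gate
// on a device allowlist or on measured frame times, not a bool.
func makeOverlay(blurCapable: Bool, frame: CGRect) -> UIView {
    if blurCapable && !UIAccessibility.isReduceTransparencyEnabled {
        // UIVisualEffectView (iOS 8+) performs the multi-pass blur,
        // but only over the pixels it actually covers.
        let effectView = UIVisualEffectView(effect: UIBlurEffect(style: .light))
        effectView.frame = frame
        return effectView
    } else {
        // Fallback: a translucent solid fill -- one cheap pass, and
        // nothing moving underneath gets live-blurred.
        let fallback = UIView(frame: frame)
        fallback.backgroundColor = UIColor(white: 0.9, alpha: 0.85)
        return fallback
    }
}
```

The fallback branch is also what you get for free when the user enables Reduce Transparency, which is why it's worth designing the cheap version up front rather than bolting it on later.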
And that's just one blur effect. Look at all the other "new" iOS things that are less visible: the endless stream of new background technologies (background fetch, silent push, NSURLSession, multi-app audio routing, various iCloud syncing stuff), the new Bluetooth stuff (iBeacon, HomeKit, Multipeer), turning WebKit's JIT into an LLVM-backed compiler, and everything else going on in iOS--it really is "doing more of everything". I remember a time when iOS was basically just a few OS processes plus one active app; as we speak, ps on my phone lists 87 processes.
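As one concrete example of those background technologies, here's a minimal sketch of background fetch, assuming the standard UIApplicationDelegate hooks; the empty fetch body is a placeholder for a real network check:

```swift
import UIKit

// Sketch of background fetch: the system wakes the app periodically,
// and the app must report back promptly via the completion handler.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Opt in to fetch wakeups; the OS decides the actual cadence
        // based on usage patterns and energy budget.
        application.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
        return true
    }

    func application(_ application: UIApplication,
                     performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // A real app would do a short NSURLSession request here,
        // then report .newData or .failed as appropriate.
        completionHandler(.noData)
    }
}
```

Every one of these subsystems is another process or daemon the OS keeps fed, which is how you get from "a few OS processes plus one app" to 87.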
Yes. The UI equivalent of automotive tail fins is expensive. That's the sole point of it.
At some point in the "near" future it'll be energetically a net positive to snap a pic with the camera, analyze the pupils, the accelerometer data, and recent eye movements, and if you're not looking at that part of the screen, it's not going to waste energy displaying it. I could even see this happening with the backlighting. Watching other people use their phones is likely to get very distracting for third parties.