Originally Posted By: hybrid8
If it's written in software, if it's executed by a machine, it can be timed. If it can be timed, it's not zero. It may be close, but it's not the same thing. This isn't an abstract theory.

And while I respect Hugo's insight, it doesn't provide a complete explanation of which frame of the "always filming" stream is actually stored.

Regarding the camera specifically, it's also been published that the sensor can't capture full resolution as quickly as it can "HD" resolutions. That translates to potentially many frames per second, but not infinite, not 60, and not 30. So whatever that number is will also be a factor here. If your release happens between capture intervals, which frame is used? The previous (if buffered) or the next?


The frame rate in the viewfinder is the full-res capture rate; it is not the same as the LCM refresh, which is fixed at 60 Hz. When I say "zero shutter lag", I mean that the camera captures the exact content of the viewfinder at the moment the button is released.
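
To make the "which frame is used?" question concrete, here is a minimal sketch of zero-shutter-lag frame selection, assuming the camera continuously writes timestamped full-res frames into a small ring buffer. The names (frame_t, ring_pick, the buffer depth) are all illustrative, not any real driver API:

#include <stdint.h>

/* Hypothetical sketch of zero-shutter-lag frame selection.  The
 * sensor continuously fills a small circular buffer with full-res
 * frames; on shutter release we pick the newest buffered frame at
 * or before the release instant, instead of waiting for the next
 * sensor readout. */

#define RING_SLOTS 4            /* illustrative buffer depth           */

typedef struct {
    uint64_t timestamp_us;      /* when this exposure ended            */
    uint8_t *pixels;            /* full-resolution Bayer data          */
} frame_t;

typedef struct {
    frame_t slots[RING_SLOTS];
    int     head;               /* index of the most recent frame      */
} ring_t;

/* Called every time a full frame lands in memory. */
void ring_push(ring_t *r, frame_t f)
{
    r->head = (r->head + 1) % RING_SLOTS;
    r->slots[r->head] = f;
}

/* Called when the user releases the shutter.  Returns the newest
 * frame captured at or before the release instant, so the saved
 * image matches what the viewfinder showed. */
frame_t *ring_pick(ring_t *r, uint64_t release_us)
{
    for (int i = 0; i < RING_SLOTS; i++) {
        int idx = (r->head - i + RING_SLOTS) % RING_SLOTS;
        if (r->slots[idx].timestamp_us <= release_us)
            return &r->slots[idx];
    }
    return &r->slots[r->head];  /* fallback: newest frame we have */
}

The point of this design is that the expensive work (continuous full-res capture into memory) has already happened by the time the button is released; the release just selects an existing frame rather than starting a new exposure.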

The experience of a user seeing something they want to capture but inexplicably (to them) missing it is very frustrating, and something that needed to be dealt with properly - in this case by ensuring the image capture and processing pipeline was seriously oversized compared to what other people shipped.

Obviously, what's on the screen will be delayed relative to the light hitting the sensor, as the frame has to be sampled, streamed out, de-Bayered and stored. Scaling happens on the way to the LCM, so that's not an extra step.
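
As a rough illustration, here's a toy latency budget for that path; every number is an assumed placeholder showing the shape of the calculation, not a measured figure from any device:

#include <stdio.h>

/* Back-of-the-envelope viewfinder latency budget.  All figures are
 * made up for illustration; scaling is folded into the display path,
 * so it contributes no separate term. */
int main(void)
{
    double exposure_ms = 16.7;  /* assumed: one frame time at ~60 fps      */
    double readout_ms  = 10.0;  /* assumed: streaming the frame off sensor */
    double debayer_ms  =  5.0;  /* assumed: demosaic and store to memory   */

    printf("viewfinder lags the scene by ~%.1f ms\n",
           exposure_ms + readout_ms + debayer_ms);
    return 0;
}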

Flash is obviously a big delay because a wimpy LED flash needs to be on for many tens of milliseconds to illuminate the scene.
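
A crude sketch of why the flash path adds that delay; led_on(), capture_frame() and the 50 ms figure are all made up for illustration, not a spec:

#include <stdio.h>
#include <unistd.h>

/* Illustrative stand-ins for real driver calls. */
static void led_on(void)        { puts("LED on"); }
static void led_off(void)       { puts("LED off"); }
static void capture_frame(void) { puts("frame captured"); }

/* The LED must be lit for many tens of milliseconds before the
 * exposure, so the scene is actually illuminated when the frame
 * is sampled.  50 ms is an assumed figure. */
int main(void)
{
    led_on();
    usleep(50 * 1000);   /* ~50 ms warm-up before exposing */
    capture_frame();
    led_off();
    return 0;
}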

Possibly I should have prefixed my comment about zero lag with the above, but even an SLR almost certainly has "real world -> capture" lag of the same order, due to moving the mirror and so on.