“Movement of camera sensor by 1/2 pixel width halves acuity.”

The title of this post was just one of many facts which Professor Rosenthal imparted during our first class.  I think about this every time I consider picking up my camera.

Nikon D800 sensor

Above is the image sensor of my camera (.jpg lifted without permission from Nikon).

According to the manufacturer’s specifications, it has 36.3 million pixels and measures 35.9 mm x 24.0 mm.

In full-frame (FX) format the sensor records an image 7,360 pixels wide by 4,912 pixels tall.

If we disregard two things:
1) the image the lens projects onto the sensor is round (lenses are circular, not rectangular), so about thirty percent of those pixels go unused because the image from the lens never falls on them, and
2) a further chunk of pixels is cropped away from that circle to make the rectangle we eventually see,
then, in an ideal situation, we have:

35.9 mm wide / 7,360 pixels = 0.0048777 mm, the approximate distance between adjacent horizontal sensing pixels,
and
24.0 mm tall / 4,912 pixels = 0.0048860 mm, the approximate distance between adjacent vertical sensing pixels.

This means that a movement of as little as 0.0024430 mm along the vertical axis, or 0.0024389 mm along the horizontal axis, will degrade the acuity of my images by half.
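
For anyone who would rather let a computer do the division, here is the same arithmetic as a short Python sketch. The sensor figures are the ones quoted above; the variable names are mine, not Nikon’s.

```python
# Back-of-the-envelope version of the arithmetic above, using the
# Nikon D800 figures quoted in this post.

SENSOR_WIDTH_MM = 35.9   # sensor width
SENSOR_HEIGHT_MM = 24.0  # sensor height
PIXELS_WIDE = 7360       # horizontal pixel count (FX format)
PIXELS_HIGH = 4912       # vertical pixel count (FX format)

# Approximate centre-to-centre pixel pitch in each direction.
pitch_h_mm = SENSOR_WIDTH_MM / PIXELS_WIDE    # ~0.0048777 mm
pitch_v_mm = SENSOR_HEIGHT_MM / PIXELS_HIGH   # ~0.0048860 mm

# Professor Rosenthal's rule of thumb: a shift of half a pixel
# width halves acuity, so the tolerance is half the pitch.
tol_h_mm = pitch_h_mm / 2                     # ~0.0024389 mm
tol_v_mm = pitch_v_mm / 2                     # ~0.0024430 mm

print(f"horizontal pitch: {pitch_h_mm:.7f} mm, half-pitch: {tol_h_mm:.7f} mm")
print(f"vertical pitch:   {pitch_v_mm:.7f} mm, half-pitch: {tol_v_mm:.7f} mm")
```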