I have a question about the sampling process when converting an analog image to digital. I understand megapixels, and how the megapixel myth has served the digital camera industry well. I also already understand that a bigger sensor is better (that's why a 14 MP point-and-shoot can be inferior in quality to, say, a 12 MP DSLR, if one with such an MP count exists).
I've been reading about how a CCD and a CMOS sensor digitize images (say, in a digital camera). The overall concept is quite clear. The only thing I'm still confused about, although it might be a simple thing, is whether the number of photosites on the sensor array translates directly to the number of pixels in the final image. Is this the case? Say the sensor array is 10 x 10 (yes, very small, but just as an example). Does this mean that a 10 x 10 sample is taken as well, which finally translates to a digital image with a 10 px x 10 px "spatial resolution"? (I'm using quotes because I've read that there are many definitions, even for spatial resolution.)
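In the simplest model of spatial sampling (ignoring Bayer-filter demosaicing, binning, and the area integration each real photosite performs), each photosite contributes one sample, so a 10 x 10 array yields a 10 x 10 digital image. A minimal sketch of that idea, using a made-up continuous scene function:

```python
import numpy as np

# Hypothetical continuous scene: brightness as a function of (x, y) in [0, 1).
def scene(x, y):
    return 0.5 + 0.5 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)

n = 10  # a 10 x 10 photosite array, as in the question

# Sample the scene at the centre of each photosite and quantize to 8 bits.
coords = (np.arange(n) + 0.5) / n
xs, ys = np.meshgrid(coords, coords)
image = np.round(scene(xs, ys) * 255).astype(np.uint8)

print(image.shape)  # one output pixel per photosite: (10, 10)
```

Real cameras complicate this (a Bayer sensor records one color per photosite and interpolates the rest), but the pixel count of the output still matches the photosite grid in this idealized picture.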
I thought of asking this question after reading this part in Gonzalez and Woods (please refer to the picture).
Would appreciate any feedback :D Thanks!

The final size of the object in the image would be:

D = (f * X) / (Z * p)

where
D: The size of the object in the image (pixels)
X: The size of the object (mm)
Z: The distance to the object (mm)
f: The focal length of the lens (mm)
p: Pixel size of the sensor (mm/pixel) - you can find this in the sensor's spec sheet.
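A quick numeric sketch of the formula above, with made-up but plausible numbers (a 50 mm lens, a 100 mm object at 1 m, and a 5 um pixel pitch), to confirm the units work out to pixels:

```python
# D = (f * X) / (Z * p): mm * mm / (mm * mm/pixel) = pixels
f = 50.0     # focal length (mm)
X = 100.0    # object size (mm)
Z = 1000.0   # distance to object (mm)
p = 0.005    # pixel size (mm/pixel), i.e. 5 um

D = (f * X) / (Z * p)  # object size in the image, in pixels
print(D)  # 1000.0 pixels
```

Note that this is the thin-lens / pinhole approximation; it assumes Z is much larger than f, which holds for typical photographic distances.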