Sensor Resolution
In my previous two posts on this subject HERE and HERE I’ve been looking at pixel resolution as it pertains to digital display and print, and the basics of how we can manipulate it to our benefit.
You should also be aware by now that I’m not the world’s biggest fan of high sensor resolution 35mm format dSLRs – there’s nothing wrong with megapixels; you can’t have enough of them in my book!
BUT, there’s a limit to how many you can cram into a 36 x 24 millimeter sensor area before things start getting silly and your photographic life gets harder.
So in this post I want to explain the reasoning behind my thoughts.
But before I get into that I want to address something else to do with resolution – the standard by which we judge everything we see around us – the resolution of the eye.
Human Eye – How Much Can We See?
In very simple terms, because I’m not an optician, the answer goes like this.
Someone with what some call 20/20/20 vision – 20/20 vision in a 20 year old – has a visual acuity of 5 line pairs per millimeter at a distance of 25 centimeters.
What’s a line pair? Simply one dark line and one adjacent light line of equal width – so at 5 line pairs per millimeter, each individual line is 0.1mm wide.
Under ideal viewing conditions in terms of brightness and contrast the human eye can at best resolve 0.1mm detail at a distance of 25 centimeters.
Drop the brightness and the contrast and black will become less black and more grey, and white will become greyer; the contrast between light and dark is reduced and therefore that 0.1mm detail becomes less distinct, until the point comes where the same eye can’t resolve detail any smaller than 0.2mm at 25cms, and so on.
Now if I try and focus on something at 25 cms my eyeballs start to ache, so we are talking extreme close focus for the eye here.
An interesting side note is that 0.1mm is 100µm (microns) and microns are what we measure the size of sensor photosites in – which brings me nicely to SENSOR resolution.
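Just to tie those numbers together, here’s the arithmetic as a couple of lines of Python – nothing in it beyond the figures already quoted above:

```python
# Resolving power of a 20/20/20 eye at 25cm, using the figures quoted above.
line_pairs_per_mm = 5                       # visual acuity at 25cm
detail_mm = 1 / (2 * line_pairs_per_mm)     # one pair = one dark + one light line

print(f"Smallest resolvable detail: {detail_mm} mm")         # 0.1 mm
print(f"...which is {detail_mm * 1000:.0f} microns (µm)")    # 100 µm
```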
Sensor Resolution – Too Many Megapixels?
As we saw in the post on NOISE, we do not give ourselves the best chance by employing sensors with small photosite diameters. It’s a basic fact of physics and mathematics – the more megapixels on a sensor, the smaller each photosite has to be in order to fit them all in; and the smaller they are, the lower their individual signal-to-noise (S/N) ratio.
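To put a rough number on that, here’s a simplified sketch assuming the sensor is purely shot-noise limited – signal proportional to photosite area, noise proportional to the square root of the signal. The photosite sizes are the D3/D800-style figures I use further down; everything else is an assumption for illustration only:

```python
import math

# A deliberately simplified, shot-noise-only model: the signal a photosite
# collects scales with its light-gathering area, and noise with the square
# root of the signal, so S/N scales with the square root of the area.
# Real sensors add read noise, microlenses etc. - this just shows the trend.
def relative_snr(photosite_um: float) -> float:
    area = photosite_um ** 2      # light-gathering area in µm²
    return math.sqrt(area)

big, small = 8.4, 4.84            # D3-style vs D800-style photosite sizes
print(f"Area ratio:    {big**2 / small**2:.1f}x")                        # ≈ 3.0x
print(f"S/N advantage: {relative_snr(big) / relative_snr(small):.2f}x")  # ≈ 1.74x
# An area ratio of about 3x means each larger photosite gathers roughly
# 1.6 stops more light in this simple model.
```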
But there is another problem that comes with increased sensor resolution:
Increased susceptibility to diffraction – the sensor hits its diffraction limit at a wider aperture.
In the above schematic we are looking at the same sized tiny surface area section on two sensors.
If we say that the sensor resolution on the left is that of a 12Mp Nikon D3, and the ‘area’ contains 3 x 3 photosites which are each 8.4 µm in size, then we can say we are looking at an area of about 25µm square.
On the right we are looking at that same 25µm (25 micron) square, but now it contains 5.2 x 5.2 photosites, each 4.84µm in size – a bit like the sensor resolution of a 36Mp D800.
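If you want to sanity-check those photosite figures yourself, pixel pitch is just sensor width divided by horizontal pixel count. The pixel counts below are the published D3 and D800 specifications as I recall them, so treat the output as approximate – it lands within a whisker of the figures above:

```python
# Approximate photosite pitch = sensor width / horizontal pixel count.
# The pixel counts are the published D3 and D800 specs as I recall them,
# so treat the results as approximate.
def pitch_um(sensor_width_mm: float, pixels_across: int) -> float:
    return sensor_width_mm * 1000 / pixels_across    # mm -> µm

print(f"D3   (12Mp, 4256 px across): {pitch_um(36.0, 4256):.2f} µm")   # ≈ 8.46 µm
print(f"D800 (36Mp, 7360 px across): {pitch_um(35.9, 7360):.2f} µm")   # ≈ 4.88 µm
```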
What is Diffraction?
Diffraction is basically the bending of waves around objects placed in their path, or their spreading as they squeeze through a narrow opening (not to be confused with refraction). As it pertains to our camera sensor, and overall image quality, it causes a general softening of every single point of sharp detail in the image that is projected onto the sensor during the exposure.
I say during the exposure because diffraction is ‘aperture driven’ and its effects only occur when the aperture is ‘stopped down’; which on modern cameras only happens during the time the shutter is open.
At all other times you are viewing the image with the aperture wide open, so you can’t see the effect unless you hit the stop-down (depth of field preview) button – if you have one – and even then the image in the viewfinder is usually too small and dark to show it.
As I said, diffraction is caused by aperture diameter – the size of the hole that lets the light in:
Light enters the lens, passes through the aperture and strikes the focal plane/sensor causing the image to be recorded.
Light waves passing through the center of the aperture and light waves passing through the periphery of the aperture all need to travel the same distance – the focal distance – in order for the image to be sharp.
The potential for the peripheral waves to be bent by the edge of the aperture diaphragm increases as the aperture becomes smaller.
If I apply some randomly chosen numbers to this you might understand it a little better:
Let’s say that the focal distance of the lens (not focal length) is 21.25mm.
As long as light passing through all points of the aperture travels 21.25mm and strikes the sensor then the image will be sharp; in other words, the more parallel the central and peripheral light waves are, then the sharper the image.
Making the aperture narrower by ‘stopping down’ increases the divergence between central and peripheral waves.
This means that peripheral waves have to travel further before they strike the sensor; further than 21.25mm – therefore they are no longer in focus, but those central waves still are. This effect gives a fuzzy halo to every single sharply focused point of light striking our sensor.
Please remember, the numbers I’ve used above are meaningless and random.
The amount of fuzziness varies with aperture – wider aperture = less fuzzy; narrower aperture = more fuzzy. The circular image produced by a single point of sharp focus is known as an Airy Disc.
As we ‘stop down’ the aperture the edges of the Airy Disc become softer and more fuzzy.
Say for example, we stick a 24mm lens on our camera and frame up a nice landscape, and we need to use f14 to generate the amount of depth of field we need for the shot. The particular lens we are using produces an Airy Disc of a very particular size at any given aperture.
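How big? The usual textbook estimate for the diameter of the Airy Disc (out to its first dark ring) is 2.44 × wavelength × f-number. Here’s a quick sketch – the 550nm green wavelength is my assumption for ‘typical’ visible light, and bear in mind that raw geometry is only part of the story, since the AA filter, Bayer demosaicing and viewing size all influence when the softening actually becomes visible:

```python
# First-minimum diameter of the Airy Disc: d ≈ 2.44 * wavelength * f-number.
# 550nm green is assumed as a representative visible wavelength.
def airy_disc_um(f_number: float, wavelength_um: float = 0.55) -> float:
    return 2.44 * wavelength_um * f_number

for f in (5.6, 8, 11, 14, 22):
    print(f"f/{f}: Airy Disc ≈ {airy_disc_um(f):.1f} µm")
# f/8 ≈ 10.7 µm, f/14 ≈ 18.8 µm, f/22 ≈ 29.5 µm - worth comparing with
# the 8.4 µm and 4.84 µm photosite sizes from earlier.
```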
Now here is the problem:
As you can see, the camera with the lower sensor resolution and larger photosite diameter contains the Airy Disc within the footprint of ONE photosite; but the disc affects NINE photosites on the camera with the higher sensor resolution.
Individual photosites basically record one single flat tone which is the average of what they see; so the net outcome of the above scenario is:
On the higher resolution sensor the Airy Disc has produced what we might think of as ‘response pollution’ in the 8 surrounding photosites. These photosites need to record the values of their own ‘bits of the image jigsaw’ as well, so you end up with a situation where each photosite on the sensor records somewhat imprecise tonal values – this is diffraction in action.
If we were to stop down to f22 or f32 on the lower resolution sensor then the same thing would occur.
If we used a wide enough aperture on the higher resolution sensor – an aperture that generated an Airy Disc the same size as or smaller than the diameter of a photosite – then only one single photosite would be affected and diffraction would not be visible.
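Turning that idea around gives a back-of-an-envelope way to estimate a sensor’s diffraction-limited aperture: find the f-number at which the Airy Disc grows to some multiple of the photosite pitch. One photosite is a very strict criterion; in practice the softening tends to become visible once the disc spans roughly two to three photosites, and with that looser multiple the sketch below lands close to the real-world figures I mention later. The multiple ‘k’ is my assumption, not a law:

```python
# f-number at which the Airy Disc spans k photosites:
# 2.44 * wavelength * N = k * pitch  =>  N = k * pitch / (2.44 * wavelength)
# k = 1 is very strict; k = 2-3 is closer to where softening becomes visible.
def diffraction_limit(pitch_um: float, k: float = 2.5,
                      wavelength_um: float = 0.55) -> float:
    return k * pitch_um / (2.44 * wavelength_um)

print(f"8.4 µm pitch (D3-style):    ~f/{diffraction_limit(8.4):.0f}")   # ~f/16
print(f"4.84 µm pitch (D800-style): ~f/{diffraction_limit(4.84):.0f}")  # ~f/9
```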
But shooting at an aperture that wide would leave us with a reduced depth of field – getting around that problem is fairly easy if you are prepared to invest in something like a Tilt-Shift lens.
Above we see two images shot with a 24mm Tilt-Shift lens, and both shots are at f3.5 – a wide open aperture. In the left hand image the lens controls are set to zero and so it behaves like a standard construction lens of 24mm and gives the shallow depth of field that you’d expect.
The image on the right is again, shot wide open at f3.5, but this time the lens was tilted down by just 1 degree – now we have depth of field reaching all the way through the image. All we would need to do now is stop the lens down to its sharpest aperture – around f8 – and take the shot; and no worries about diffraction.
Getting back to sensor resolution in general, if you move into high megapixel counts on 35mm format then you are in a ‘Catch 22’ situation:
- Greater sensor resolution enables you to theoretically capture greater levels of detail.
but that extra level of detail is somewhat problematic because:
- Diffraction renders it ‘soft’.
- Eliminating the diffraction by opening up the aperture means you potentially lose that newly acquired level of, say, foreground detail in a landscape, through lack of depth of field.
All digital sensors are susceptible to diffraction at some point or other – they are ‘diffraction limited’.
Over the years I’ve owned a Nikon D3 I’ve found it diffraction limited to between f16 & f18 – I can see it at f18 but can easily rescue the situation. When I first used a 24Mp D3X I forgot what I was using and spent a whole afternoon shooting at f16 & f18 – I had to go back the next day for a re-shoot because the sensor is diffraction limited to f11 – the pictures certainly told the story!
Everything in photography is a trade-off – you can’t have more of one thing without having less of another. Back in the days of film we could get by with one camera and use different films because they had very different performance values, but now we buy a camera and expect its sensor to perform all tasks with equal dexterity – sadly, this is not the case. All modern consumer sensors are jacks of all trades.
If it’s sensor resolution you want, and image quality to the nth degree, then by far the best way to go about it is to jump to medium format – this way you get the ‘pixel resolution’ without many of the attendant problems I’ve mentioned, simply because the sensors are twice the size; or invest in a TS/PC lens and take the Scheimpflug route to more depth of field at a wider aperture.