Camera Imaging Area
For God’s Sake! Another bloody idiot YouTuber uploaded a video the other day where they were trying out the Fuji GFX 50, and just moments into said video he came out with the same old pile of junk which amounts to “it’s a bigger sensor so it soaks up more light”.
So now his 76,000+ subscribers are misled into believing something that is plain WRONG.
For anyone who didn’t manage to grasp what I was saying in my previous post HERE let’s try attacking this crackpot concept from a different angle.
Devotees of this farcical belief quite often liken larger sensors to bigger windows in a room.
“A bigger window lets in more light” they say.
Erm…no it doesn’t. A bigger window has an increased SURFACE AREA that just lets in a larger area of THE SAME LIGHT VALUE.
A 6-foot-square pane of glass has a transmission value that is EXACTLY the same as that of a 3-foot-square pane of the same glass, therefore a ‘BIGGER WINDOW’ onto the outside world does NOT let in more light.
Imagine we have a room with a 6×6 foot window and a 3×3 foot window in one wall. Now go press your nose up to both windows – does the world outside look any different?
No, of course it doesn’t.
The only property that ‘window size’ has any bearing on is the area of the ‘illumination footprint’.
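To put that footprint point in numbers – and this is just a toy sketch, the figures and function name are made up for illustration – the pane’s area only scales the total lumens falling inside the room (the footprint); the lux, the ‘light value’ itself, never moves:

```python
def flux_through_window(illuminance_lux: float, area_m2: float) -> float:
    # Total luminous flux (lumens) = illuminance (lux = lm/m²) × pane area.
    # Area scales the FOOTPRINT of the light, not its value.
    return illuminance_lux * area_m2

SQFT_TO_M2 = 0.0929  # approximate square-feet to square-metres conversion

big = flux_through_window(10_000, 6 * 6 * SQFT_TO_M2)    # 6×6 ft pane
small = flux_through_window(10_000, 3 * 3 * SQFT_TO_M2)  # 3×3 ft pane

# Four times the total lumens through the big pane (bigger footprint), yet
# press your nose to either window and the scene is the same 10,000 lux.
print(round(big / small, 2))  # 4.0
```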
So basically the window analogy has ZERO bearing on the matter!
What lets light into the camera is the LENS, not the damn sensor!
The ‘illuminance value’ – or Ev – of the light leaving the back of the lens and entering the imaging plane DOES NOT CHANGE if we swap out our FX body for a crop body – DOES IT!!??!!
So why do these bloody idiots seem to think physics changes when we put a bigger sensor behind the lens? It’s pure abject stupidity.
The imaging area of a sensor has ZERO effect on the intensity of light striking it – that is something that is only influenced by aperture (intensity) and shutter speed (time).
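If you want the receipts, here they are in one line of maths. The standard exposure value formula contains an aperture term and a shutter term and absolutely nothing else – a minimal sketch (the function name is mine):

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Standard EV = log2(N² / t). Note there is NO sensor-size term
    anywhere in here – aperture and time, that's your lot."""
    return math.log2(f_number ** 2 / shutter_s)

# f/8 at 1/125s is the same EV on a crop body, an FX body or a GFX:
print(round(exposure_value(8.0, 1 / 125), 2))  # 12.97
```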
With digital photography, exposure is ‘per unit area’ NOT total area. A DSLR sensor is NOT a single unit but an amalgamation of individual units called PHOTOSITES, or pixels. Hence it is the photosite area that governs exposure, NOT the sensor’s total area.
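Here is that ‘per unit area’ point as a toy calculation – the photon flux figure is invented purely for illustration. Give two photosites of the same pitch the same illuminance and they collect the same count, whether they sit in a crop sensor or an FX one:

```python
PHOTONS_PER_UM2 = 1000  # assumed flux per exposure, for illustration only

def photons_per_photosite(pitch_um: float) -> float:
    # The count depends on the PHOTOSITE's area (pitch²), and on nothing
    # about the total imaging area surrounding it.
    return PHOTONS_PER_UM2 * pitch_um ** 2

crop_pixel = photons_per_photosite(4.0)  # a 4 µm pixel in a crop body
fx_pixel = photons_per_photosite(4.0)    # an identical 4 µm pixel in an FX body
print(crop_pixel == fx_pixel)  # True – sensor format never entered the sum
```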
There is a sensor on the market that blows all this ‘sucks in more light’ crap clean out of the water, and that sensor is the Sony IMX224MQV. This is a 1/3-type sensor with a diagonal of just 6.09mm and 1.27Mp. By definition this is one hell of a small ‘window’, yet it can ‘see’ light down to 0.005 lux with good enough SNR to allow the image processor to capture 120 10-bit images per second.
A camera’s ‘window onto the world’ is the lens – end of!
Imagine going back to film for a moment – the correct exposure value was the same for, say, Ilford FP4 irrespective of whether you were using 35mm, 120/220 roll film, 5×4 or 10×8 sheet film.
The size of the recording media within the imaging plane was and still is completely irrelevant to exposure.
Bigger recording media never have and never will ‘suck in’ more light, because they can’t suck in more light than the lens is transmitting!
The only properties of the sensor within the imaging area that WILL change how it reacts to the light transmitted by the lens are:
- Photosite surface area – number of megapixels
- Sensor construction – FSI vs BSI
- Micro lens design
- CFA array absorption characteristics
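The four bullet points above can be strung together as a crude per-pixel signal model – every number below is made up and the parameter names are mine, but notice which variable never appears: total sensor area.

```python
def photosite_signal(flux_per_um2: float, pitch_um: float,
                     qe: float, microlens_gain: float,
                     cfa_transmission: float) -> float:
    """Toy model: per-pixel signal = flux × photosite area × quantum
    efficiency × micro-lens gain × CFA transmission. Every term is a
    property of the photosite stack; none is the total imaging area."""
    return (flux_per_um2 * pitch_um ** 2
            * qe * microlens_gain * cfa_transmission)

# Illustrative (invented) numbers for FSI vs BSI photosites of equal pitch:
fsi = photosite_signal(1000, 4.3, qe=0.50, microlens_gain=0.85, cfa_transmission=0.60)
bsi = photosite_signal(1000, 4.3, qe=0.65, microlens_gain=0.90, cfa_transmission=0.60)
print(bsi > fsi)  # True – construction moved the number, not format
```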
After my previous post some stroppy idiot emailed me saying Ken Wheeler AKA The Angry Photographer says the Nikon D850 is a full frame Nikon D500, and that because the D850 is an FX camera and has better dynamic range then this proves I’m talking bollocks.
Well, Ken never said this in terms of anything other than approximate pixel density – he’s not that stupid and dick-heads should listen more carefully!
The D500 uses an FSI sensor while the D850 uses a BSI sensor with a totally different micro lens design, A/D converter and image processor.
Out of the 4 characteristics listed above, 3 of them are DRASTICALLY different between the two cameras, and the other is different enough to have definite implications – so you cannot compare them ‘like for like’.
But using the same lens, shutter speed, ISO and aperture while imaging a flat white or grey scene, the sensor in a D850 will ‘see’ no higher light value than the sensor in a D500.
Why? Because the light emanating from the scene doesn’t change and neither does the light transmitted by the lens.
I own what was the best light meter on the planet – the Sekonic 758. Nowhere does it have a sensor-size function/conversion button on it, and neither does its superseding brother, the 858!
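And for good reason: the conversion an incident meter performs needs only the light itself. Using one common calibration, E = 2.5 × 2^EV (lux at ISO 100), the whole sum looks like this – a sketch of the arithmetic, not Sekonic’s firmware, and the calibration constant varies between meter makers:

```python
import math

def incident_ev_at_iso100(lux: float) -> float:
    # One common incident-meter calibration: E = 2.5 × 2^EV at ISO 100,
    # rearranged to EV = log2(lux / 2.5). No sensor-size input exists.
    return math.log2(lux / 2.5)

print(incident_ev_at_iso100(2560.0))  # 10.0 – the same answer on any format
```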
There are numerous advantages and disadvantages between bigger and smaller sensors but bigger ‘gathering’ or ‘soaking up’ more light isn’t one of them!
So the next time you hear someone say that increased size of the imaging area – bigger sensor size – soaks up more photons you need to stop listening to them because they do not know what the hell they’re talking about.
But if you choose to believe what they say then so be it – in the immortal words of Forrest Gump: “Momma says stoopid is as stoopid does…”
Post Script:
Above you can see the imaging area for various digital sensor formats.
Each imaging area is accurately proportional to the others.
Compare FX to the PhaseOne IQ4. Never, repeat never think that any FX format sensor will ever deliver the same image fidelity as a 645 sensor – it won’t.
Why?
Because look at how much the fine detail in a scene has to be crushed down by the lens to make it ‘fit’ into the sensor imaging area on FX compared to 645.
Andy, you’re talking crap! Am I? Why do you think the world’s top product and landscape photographers shoot medium format digital?
Here’s the skinny – it’s not because they can afford to, but rather they can’t afford NOT TO.
As for the GFX50 – its imaging area is around 66% that of true MF, and it’s smaller than a lot of its ‘wannabe’ owners imagine.