Pixel Resolution – part 2

More on Pixel Resolution

In my previous post on pixel resolution I mentioned that it has some serious ramifications for print.

The major one is PHYSICAL or LINEAR image dimension.

In that previous post I said:

  • Pixel dimension divided by pixel resolution = linear dimension

Now, as we saw in the previous post, linear dimension has zero effect on ‘digital display’ image size – here are those two snake jpegs again:


European Adder – 900 x 599 pixels with a pixel resolution of 300PPI


European Adder – 900 x 599 pixels with a pixel resolution of 72PPI

Digital display size is driven by pixel dimension, NOT linear dimension or pixel resolution.

Print on the other hand is directly driven by image linear dimension – the physical length and width of our image in inches, centimeters or millimeters.

Now I teach this ‘stuff’ all the time at my Calumet workshops and I know it’s hard for some folk to get their heads around print size and printer output, but it really is simple and straightforward if you just think about it logically for a minute.

Let’s get away from snakes and consider this image of a cute Red Squirrel:


Red Squirrel with Bushy Tail – what a cutey!
Shot with Nikon D4 – full frame render.

Yeah yeah – he’s a bit big in the frame for my taste, but it’s a seller so boo-hoo – what do I know!!

Shot on a Nikon D4 – the relevance of which is this:

  • The D4 has a sensor with a linear dimension of 36 x 24 millimeters, but more importantly a photosite dimension of 4928 x 3280. (This is the effective imaging area – the total photosite area is 4992 x 3292 according to DxO Labs.)

Importing this image into Lightroom, ACR, Bridge, CapOne Pro etc. takes that photosite dimension as the pixel dimension.

They also attach the default standard pixel resolution of 300 PPI to the image.

So now the image has a set of physical or linear dimensions:

  • 4928/300  x  3280/300 inches  or  16.43″ x 10.93″

or

  • 417.24 x 277.71 mm for those of you with a metric inclination!
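If you like to see the arithmetic as code, here’s a minimal Python sketch of that calculation. The numbers are just the D4 values from above; nothing else is assumed:

```python
# Print (linear) size from pixel dimensions and pixel resolution (PPI).
pixels_long, pixels_short = 4928, 3280   # D4 effective pixel dimensions
ppi = 300                                # default resolution attached on import

inches_long = pixels_long / ppi
inches_short = pixels_short / ppi

print(f"{inches_long:.2f} x {inches_short:.2f} inches")            # 16.43 x 10.93 inches
print(f"{inches_long * 25.4:.2f} x {inches_short * 25.4:.2f} mm")  # 417.24 x 277.71 mm
```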

So how big CAN we print this image?

 

Pixel Resolution & Image Physical Dimension

Let’s get back to that sensor for a moment and ask ourselves a question:

  • “Does a sensor contain pixels, and can it have a PPI resolution attached to it?”
  • Well, the strict answer would be No, and No – not really.

But because the photosite dimensions end up being ‘converted’ to pixel dimensions then let’s just for a moment pretend that it can.

The ‘effective’ PPI value for the D4 sensor can easily be derived from the long-edge ‘pixel’ count of the FX frame divided by its linear length, which is just shy of 36mm, or roughly 1.4″ – 3520 PPI or thereabouts.

So, if we take this all literally our camera captures and stores a file that has linear dimensions of  1.4″ x 0.9″, pixel dimensions of  4928 x 3280 and a pixel resolution of 3520 PPI.
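Just to show where that ‘effective’ sensor PPI figure comes from, here’s a quick sketch. Note that the 3520 figure relies on rounding the 36mm edge down to 1.4 inches:

```python
# 'Effective' pixel resolution of the sensor itself, treating photosites as pixels.
photosites_long_edge = 4928            # D4 long-edge photosite count
sensor_long_edge_inches = 36 / 25.4    # 36 mm is roughly 1.42 inches

print(f"~{photosites_long_edge / sensor_long_edge_inches:.0f} PPI")  # ~3477 PPI
print(f"~{photosites_long_edge / 1.4:.0f} PPI")                      # ~3520 PPI with the 1.4 inch rounding
```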

Import this file into Lightroom for instance, and that pixel resolution is reduced to 300 PPI.  It’s this very act that renders the image on our monitor at a size we can work with.  Otherwise we’d be working on postage stamps!

And what has that pixel resolution done to the linear image dimensions?  Well it’s basically ‘magnified’ the image – but by how much?

 

Magnification & Image Size

Magnification factors are an important part of digital imaging and image reproduction, so you need to understand something – magnification factors are always calculated on the diagonal.

So we need to identify the diagonals of both our sensor, and our 300 PPI image before we can go any further.

Here is a table of typical sensor diagonals:

Andy Astbury

Table of Sensor Diagonals for Digital Cameras.

And here is a table of metric print media sizes:

Andy Astbury

Metric Paper Sizes including diagonals.

To get back to our 300 PPI image derived from our D4 sensor, Pythagoras tells us that our 16.43″ x 10.93″ image has a diagonal of 19.73″ – or 501.14mm.

So with a sensor diagonal of 43.2mm we arrive at a magnification factor of around 11.6x for our 300 PPI native image as displayed on our monitor.

This means that EVERYTHING on the sensor – photosites/pixels, dust bunnies, logs, lumps of coal, circles of confusion, Airy Discs – the lot – are magnified by that factor.

Just to add variety, a D800/800E produces native 300 PPI images at 24.53″ x 16.37″ – a magnification factor of 17.3x over the sensor size.
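For anyone who wants to check the magnification arithmetic, here’s a short Pythagoras sketch. The D800 pixel dimensions of 7360 x 4912 are the published effective values, and the computed sensor diagonal lands a whisker over the 43.2mm quoted in the table:

```python
import math

def diagonal(a, b):
    """Pythagoras: length of the diagonal from two edge lengths."""
    return math.hypot(a, b)

# D4: 4928 x 3280 pixels rendered at 300 PPI; sensor 36 x 24 mm
image_diag_mm = diagonal(4928 / 300, 3280 / 300) * 25.4   # ~501 mm (19.73")
sensor_diag_mm = diagonal(36, 24)                         # ~43.3 mm

print(f"D4 magnification: {image_diag_mm / sensor_diag_mm:.1f}x")     # ~11.6x

# D800/D800E: 7360 x 4912 pixels at 300 PPI on the same 36 x 24 mm format
d800_diag_mm = diagonal(7360 / 300, 4912 / 300) * 25.4
print(f"D800 magnification: {d800_diag_mm / sensor_diag_mm:.1f}x")    # ~17.3x
```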

So you can now begin to see why pixel resolution is so important when we print.

 

How To Blow Up A Squirrel!

Let’s get back to ‘his cuteness’ and open him up in Photoshop:


Our Squirrel at his native 300 PPI open in Photoshop.

See how I keep you on your toes – I’ve switched to millimeters now!

The image is 417 x 277 mm – in other words it’s basically A3.

What happens if we hit print using A3 paper?


Red Squirrel with Bushy Tail. D4 file at 300 PPI printed to A3 media.

Whoops – that’s not good at all because there is no margin.  We need workable margins for print handling and for mounting in cut mattes for framing.

Do not print borderless – it’s tacky, messy and it screws your printer up!

What happens if we move up a full A size and print A2:


Red Squirrel D4 300 PPI printed on A2

Now that’s just overkill.

But let’s open him back up in Photoshop and take a look at that image size dialogue again:


Our Squirrel at his native 300 PPI open in Photoshop.

If we remove the check mark from the resample section of the image size dialogue box (circled red) and make one simple change:


Our Squirrel at a reduced pixel resolution of 240 PPI open in Photoshop.

All we need to do is to change the pixel resolution figure from 300 PPI to 240 PPI and click OK.

We make NO apparent change to the image on the monitor display because we haven’t changed any physical dimension and we haven’t resampled the image.

All we have done is tell the print pipeline that every 240 pixels of this image must occupy 1 linear inch of paper – instead of 300 pixels per linear inch of paper.
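Here’s a tiny sketch of what that re-tag does: pixel dimensions untouched, only the physical print size changes (the paper sizes in the comments are the standard A3/A2 figures):

```python
# Re-tagging the pixel resolution (no resampling): pixel dimensions stay fixed,
# only the physical print size changes.
pixels = (4928, 3280)

def print_size_mm(px, ppi):
    return tuple(round(p / ppi * 25.4, 1) for p in px)

print(print_size_mm(pixels, 300))   # (417.2, 277.7) -> fills A3 (420 x 297 mm) with virtually no margin
print(print_size_mm(pixels, 240))   # (521.5, 347.1) -> sits on A2 (594 x 420 mm) with workable margins
```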

Let’s have a look at the final outcome:


Red Squirrel D4 240 PPI printed on A2.

Perfick… as Pop Larkin would say!

Now we have workable margins to the print for both handling and mounting purposes.

But here’s the big thing – printed at 2880+ DPI printer output resolution you would see no difference in visual print quality.  Indeed, 240 PPI was the Adobe Lightroom/ACR default pixel resolution until fairly recently.

So there we go, how big can you print?? – Bigger than you might think!

And it’s all down to pixel resolution – learn to understand it and you’ll find a lot of  the “murky stuff” in photography suddenly becomes very simple!

Become a patron from as little as $1 per month, and help me produce more free content.

Patrons gain access to a variety of FREE rewards, discounts and bonuses.

Pixel Resolution

What do we mean by Pixel Resolution?

Digital images have two sets of dimensions – physical size or linear dimension (inches, centimeters etc) and pixel dimensions (long edge & short edge).

The physical dimensions are simple enough to understand – the image is so many inches long by so many inches wide.

Pixel dimension is straightforward too – ‘x’ pixels long by ‘y’ pixels wide.

If we divide the pixel dimensions by the physical dimensions we arrive at the PIXEL RESOLUTION.

Let’s say, for example, we have an image with pixel dimensions of 3000 x 2400 pixels, and a physical, linear dimension of 10 x 8 inches.

Therefore:

3000 pixels/10 inches = 300 pixels per inch, or 300PPI

and obviously:

2400 pixels/8 inches = 300 pixels per inch, or 300PPI

So our image has a pixel resolution of 300PPI.
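Or, written out as code, one division per edge:

```python
# Pixel resolution = pixel dimension / linear dimension, calculated per edge.
long_edge_ppi = 3000 / 10    # 300.0 PPI
short_edge_ppi = 2400 / 8    # 300.0 PPI
print(long_edge_ppi, short_edge_ppi)
```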

 

How Does Pixel Resolution Influence Image Quality?

In order to answer that question let’s look at the following illustration:


The number of pixels contained in an image of a particular physical size has a massive effect on image quality. CLICK to view full size.

All 7 images are 0.5 x 0.5 inches square.  The image on the left has 128 pixels per 0.5 inch of physical dimension, therefore its PIXEL RESOLUTION is 2 x 128 PPI (pixels per inch), or 256PPI.

As we move from left to right we halve the number of pixels contained in the image whilst maintaining the physical size of the image – 0.5″ x 0.5″ – so the pixels in effect become larger, and the pixel resolution becomes lower.

The fewer the pixels we have then the less detail we can see – all the way down to the image on the right where the pixel resolution is just 4PPI (2 pixels per 0.5 inch of edge dimension).
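The progression across those 7 squares is just repeated halving. A quick sketch, assuming the 128-pixel starting point shown on the left:

```python
# Halving the pixel count along each edge of a fixed 0.5" x 0.5" square
# halves the pixel resolution each time: 256, 128, 64, 32, 16, 8, 4 PPI.
edge_inches = 0.5
pixels_per_edge = 128

for _ in range(7):
    print(f"{pixels_per_edge:>3} px per 0.5\" edge -> {pixels_per_edge / edge_inches:.0f} PPI")
    pixels_per_edge //= 2
```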

The thing to remember about a pixel is this – a single pixel can only contain 1 overall value for hue, saturation and brightness, and from a visual point of view it’s as flat as a pancake in terms of colour and tonality.

So, the more pixels we can have between point A and point B in our image the more variation of colour and tonality we can create.

Greater colour and tonal variation means we preserve MORE DETAIL and we have a greater potential for IMAGE SHARPNESS.

REALITY

So we have our 3 variables; image linear dimension, image pixel dimension and pixel resolution.

In our typical digital work flow the pixel dimension is derived from the photosite dimension of our camera sensor – so this value is fixed.

RAW file handlers like Lightroom, ACR etc. all default to a native pixel resolution of 300PPI.* (This 300PPI myth annoys the hell out of me and I’ll explain all in another post.)

So basically the pixel dimension and default resolution SET the image linear dimension.

If our image is destined for PRINT then this fact has some serious ramifications; but if our image is destined for digital display then the implications are very different.

 

Pixel Resolution and Web JPEGS.

Consider the two jpegs below, both derived from the same RAW file:


European Adder – 900 x 599 pixels with a pixel resolution of 300PPI


European Adder – 900 x 599 pixels with a pixel resolution of 72PPI

In order to illustrate the three values of linear dimension, pixel dimension and pixel resolution of the two images let’s look at them side by side in Photoshop:


The two images opened in Photoshop – note the image size dialogue contents – CLICK to view full size.

The two images differ in one respect – their pixel resolutions.  The top Adder is 300PPI, the lower one has a resolution of 72PPI.

The simple fact that these two images appear to be exactly the same size on this page means that, for DIGITAL display, pixel resolution is meaningless when it comes to ‘how big the image is’ on the screen – what makes them appear the same size is their identical pixel dimensions of 900 x 599 pixels.

Digital display devices such as monitors, iPads, laptop screens etc. are all PIXEL DIMENSION dependent.  They do not understand inches or centimeters, and they display images AT THEIR OWN resolution.

Typical displays and their pixel resolutions:

  • 24″ monitor = typically 75 to 95 PPI
  • 27″ iMac display = 109 PPI
  • iPad 3 or 4 = 264 PPI
  • 15″ Retina Display = 220 PPI
  • Nikon D4 LCD = 494 PPI

Just so that you are sure to understand the implication of what I’ve just said – you CAN NOT see your images at their NATIVE 300 PPI resolution when you are working on them.  Typically you’ll work on your images whilst viewing them at about 1/3rd native pixel resolution.
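A quick sketch makes the point. The display densities are the ones listed above (I’ve assumed 95 PPI for the 24″ monitor), and the ‘roughly 1/3rd’ figure drops straight out:

```python
# On-screen size is set by the DISPLAY's pixel density, not by the PPI tag in the file.
image_px = (900, 599)

displays_ppi = {          # densities from the list above; 95 PPI assumed for the 24" monitor
    "24-inch monitor": 95,
    "27-inch iMac": 109,
    "iPad 3/4": 264,
}

for name, ppi in displays_ppi.items():
    w, h = (p / ppi for p in image_px)
    print(f"{name}: {w:.1f} x {h:.1f} inches on screen")

# Working scale relative to a 300 PPI 'native' rendering, on a ~95 PPI monitor:
print(f"viewing scale: {95 / 300:.2f}x, roughly 1/3rd of native")
```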

Yes, you can see 2/3rds native on a 15″ MacBook Pro Retina – but who the hell wants to do this – the display area is minuscule and its display gamut is pathetically small. 😉

Getting back to the two Adder images, you’ll notice that the one thing that does change with pixel resolution is the linear dimensions.

Whilst the 300 PPI version is a tiny 3″ x 2″ image, the 72 PPI version is a whopping 12″ x 8″ by comparison – now you can perhaps understand why I said earlier that the implications of pixel resolution for print are fundamental.

Just FYI – when I decide I’m going to create a small jpeg to post on my website, blog, a forum, Flickr or whatever – I NEVER ‘down sample’ to the usual 72 PPI that gets touted around by idiots and know-nothing fools as “the essential thing to do”.

What a waste of time and effort!

Exporting a small jpeg at ‘full pixel resolution’ misses out the unnecessary step of down sampling and has an added bonus – anyone trying to send the image direct from browser to a printer ends up with a print the size of a matchbox, not a full sheet of A4.

It won’t stop image theft – but it does confuse ’em!

I’ve got a lot more to say on the topic of resolution and I’ll continue in a later post, but there is one thing related to PPI that is my biggest ‘pet peeve’:

 

PPI and DPI – They Are NOT The Same Thing

Nothing makes my blood boil more than the persistent ‘mix up’ between pixels per inch and dots per inch.

Pixels per inch is EXACTLY what we’ve looked at here – PIXEL RESOLUTION; and it has got absolutely NOTHING to do with dots per inch, which is a measure of printer OUTPUT resolution.

Take a look inside your printer driver; here we are inside the driver for an Epson 3000 printer:


The Printer Driver for the Epson 3000 printer. Inside the print settings we can see the output resolutions in DPI – Dots Per Inch.

Images would be really tiny if those resolutions were anything to do with pixel density.

It surprises a lot of people when they come to the realisation that pixels are huge in comparison to printer dots – yes, it can take nearly 400 printer dots (20 dots square) to print 1 square pixel in an image at 300 PPI native.
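If you want to put rough numbers on that, divide the driver’s output resolution by the image’s pixel resolution on each axis. The sketch below assumes a 2880 x 1440 DPI driver setting; other settings give other counts, but every pixel is still painted with many printer dots:

```python
# Printer dots per image pixel = printer output resolution (DPI) / image pixel
# resolution (PPI), worked out per axis.
image_ppi = 300
driver_dpi_x, driver_dpi_y = 2880, 1440   # one common Epson driver setting

dots_per_pixel = (driver_dpi_x / image_ppi) * (driver_dpi_y / image_ppi)
print(f"~{dots_per_pixel:.0f} printer dots per image pixel")   # ~46 at this setting

# The exact count depends on the driver resolution chosen (and on how overlapping
# dots are counted) - the point is that one pixel is rendered with MANY dots.
```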

See you in my next post!


Noise and the Camera Sensor

Camera sensors all suffer from two major afflictions – diffraction and noise – and between them these two cause more consternation amongst photographers than anything else.

In this post I’m going to concentrate on NOISE, that most feared of sensor afflictions, and its biggest influencer – LIGHT, and its properties.

What Is Light?

As humans we perceive light as being a constant continuous stream or flow of electromagnetic energy, but it isn’t!   Instead of flowing like water it behaves more like rain, or indeed, bullets from a machine gun!   Here’s a very basic physics lesson:

Below is a diagram showing the Bohr atomic model.

We have a single positively charged proton (black) forming the nucleus, and a single negatively charged electron (green) orbiting the nucleus.

The orbit distance n1 is defined by the electrostatic balance of the two opposing charges.


The Bohr Atomic Model

If we apply energy to the system then a ‘tipping point’ is reached and the electron is forced to move away from the nucleus – n2.

Apply even more energy and the system tips again and the electron is forced to move to an even higher energy level – n3.

Now here’s the fun bit – stop applying energy to the system.

As the system is no longer needing to cope with the excess energy it returns to its natural ‘ground’ state and the electron falls back to n1.

In the process the electron sheds the energy it has absorbed – the red squiggly bit – as a quantum, or packet, of electromagnetic energy.

This is basically how a flash gun works.

This ‘packet’ has a start and an end; the start happens as the electron begins its fall back to its ground state; and the end occurs once the electron arrives at n1 – therefore it can perhaps be tentatively thought of as being particulate in nature.

So now you know what Prof. Brian Cox knows – CERN here we come!

Right, so what’s this got to do with photography and camera sensor noise?

Camera Sensor Noise

All camera sensors are affected by noise, and this noise comes in various guises:

Firstly, the ‘noise control’ sections of most processing software we use tend to break it down into two components: luminosity, or luminance, noise; and colour noise.  Below is a rather crappy image that I’m using to illustrate what we might assume is the reality of noise:


This shot shows both Colour & Luminance noise.
The insert shows the shot and the small white rectangle is the area we’re concentrating on.

Now let’s look at the two basic components.  Firstly, the LUMINANCE component:


Here we see the LUMINANCE noise component – colour & colour noise components have been removed for clarity.

Next, the COLOUR NOISE bit:


The COLOUR NOISE component of the area we’re looking at. All luminance noise has been removed.

I must stress that the majority of colour noise you see in your files inside LR, ACR, CapOne, PS etc. is ‘demosaicing colour noise’, which occurs during the demosaic process.

But the truth is, it’s not that simple.

Localised random colour errors are generated ‘on sensor’ due to the individual sensor characteristics, as we’ll see in a moment, because noise, in truth, comes in various guises that collectively affect luminosity and colour:


Shot Noise

This first type of noise is Shot Noise – so called because it’s basically an intrinsic part of the exposure, and is caused by photon flux in the light reflected by the subject/scene.

Remember – we see light in a different way to that of our camera. What we don’t notice is the fact that photon streams rise and fall in intensity – they ‘flux’ – these variations happen far too fast for our eyes to notice, but they do affect the sensor output.

On top of this ‘fluxing’ problem we have something more obvious to consider.

Lighter subjects reflect more light (more photons), darker subjects reflect less light (fewer photons).

Your exposure is always going to be some sort of ‘average’, and so is only going to be ‘accurate’ for certain areas of the scene.

Lighter areas will be leaning towards over exposure; darker areas towards under exposure – your exposure can’t be perfect for all tones contained in the scene.

Tonal areas outside of the ‘average exposure perfection’ – especially the darker ones – may well contain more shot noise.

Shot noise is therefore quite regular in its distribution, but in certain areas it becomes irregular – so it’s often described as ‘pseudo random’.
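The statistics behind this aren’t covered above, but shot noise is conventionally modelled as Poisson photon counting, where the signal-to-noise ratio grows as the square root of the photon count. That’s exactly why the darker, photon-starved areas fare worse. A quick simulation, assuming that Poisson model and purely illustrative photon counts:

```python
import numpy as np

rng = np.random.default_rng(0)

def shot_noise_snr(mean_photons, samples=100_000):
    """Photon arrivals modelled as Poisson counts: SNR ~ sqrt(mean count)."""
    counts = rng.poisson(mean_photons, samples)
    return counts.mean() / counts.std()

# A well-lit area of the frame versus a deep shadow:
print(f"bright area (10,000 photons): SNR ~ {shot_noise_snr(10_000):.0f}")   # ~100
print(f"shadow area (100 photons):    SNR ~ {shot_noise_snr(100):.0f}")      # ~10
```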


Read Noise

Read Noise – now we come to a different category of noise completely.

The image is somewhat exaggerated so that you can see it, but basically this is a ‘zero light’ exposure; take a shot with the lens cap on and this is what happens!

What you can see here is the background sensor noise when you take any shot.

Certain photosites on the sensor are actually generating electrons even in the complete absence of light – seeing as they’re photo-voltaic they shouldn’t be doing this – but they do.

Added to this are A/D converter errors and general ‘system noise’ generated by the camera – so we can regard Read Noise as being like the background hiss, hum and rumble we can hear on a record deck when we turn the Dolby off.


Thermal & Pattern Noise

In the same category as Read Noise are two other types of noise – thermal and pattern.

Both again have nothing to do with light falling on the sensor, as this too was shot under a duvet with the lens cap on – a 30 minute exposure at ISO 100 – not as daft as it sounds when you think of astrophotography and star trail shots in particular.

You can see in the example that there are lighter and darker areas especially over towards the right side and top right corner – this is Thermal Noise.

During long exposures the sensor actually heats up, which in turn increases the response of photosites in those areas and causes them to release more electrons.

You can also see distinct vertical and some horizontal banding in the example image – this is pattern noise, yet another sensor noise signature.


Under Exposure Noise – pretty much what most photographers think of when they hear the word “noise”.

Read Noise, Pattern Noise, Thermal Noise and, to a degree, Shot Noise all go together to form a ‘baseline noise signature’ for your particular sensor.  So when we put them all together and take a shot where we need to tweak the exposure in the shadow areas a little, we get an overall Under Exposure Noise characteristic for our camera – which, let’s not forget, also contains luminance noise and colour noise components derived from the ISO setting we use.

All sensors have a base ISO – this can be thought of as the speed rating which yields the highest Dynamic Range (Dynamic Range falls with increasing ISO values, which is basically under exposure).

At this base ISO the levels of background noise generated by the sensor just being active (Pattern, Read & Thermal) will be at their lowest, and can be thought of as the ‘base noise’ of the sensor.

How visually apparent this base noise level is depends on what is called the Signal to Noise Ratio – the higher the S/N ratio the less you see the noise.

And what is it that gives us a high signal?

MORE Photons – that’s what..!

The more photons each photosite on the sensor can gather during the exposure, the more any internal noise will be ‘masked’.

And how do we catch more photons?

By using a sensor with BIGGER photosites – a larger pixel pitch – that’s how.  And bigger photosites mean FEWER MEGAPIXELS – allow me to explain.

Buckets in the Rain

Here we see a representation of various sized photosites from different sensors.

On the right is the photosite of a Nikon D3s – a massive ‘bucket’ for catching photons in – and 12Mp resolution.

Moving left we have another FX sensor photosite – the D3X at 24Mp, and then the crackpot D800 and its mental, tiny 36Mp photosite – can you tell I dislike the D800 yet?

On the extreme left is the photosite from the 1.5x APS-C D7100, just for comparison.

Now cast your mind back to the start of this post where I said we could tentatively regard photons as particles – well, let’s imagine them as rain drops, and the photosites in the diagram above as different sized buckets.

Let’s put the buckets out in the back yard and let’s make the weather turn to rain:


Various sizes of photosites catching photon rain.

Here it comes…


It’s raining

OK – we’ve had 2 inches of rain in 10 seconds! Make it stop!


All buckets have 2 inches of water in them, but which has caught the biggest volume of rain?

Thank God for that..

If we now get back to reality, we can liken the duration of the downpour to shutter speed, the rain drops themselves to photons falling on the sensor, and the consistent water depth in each ‘bucket’ to a correct level of exposure.

Which bucket has the largest volume of water, or which photosite has captured the most photons – in other words which sensor has the highest S/N Ratio?   That’s right – the 12Mp D3s.
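To put rough numbers on the bucket analogy, here’s a sketch comparing photon-gathering area per photosite. The sensor dimensions and pixel counts are approximate published figures, so treat the ratios as indicative rather than gospel:

```python
# Relative 'bucket' size: sensor area divided by photosite count.
# Sensor dimensions and pixel counts are approximate published figures.
sensors = {
    "D3s (FX, 12 MP)": (36.0 * 23.9, 12.1e6),
    "D3X (FX, 24 MP)": (35.9 * 24.0, 24.5e6),
    "D800 (FX, 36 MP)": (35.9 * 24.0, 36.3e6),
    "D7100 (DX, 24 MP)": (23.5 * 15.6, 24.1e6),
}

d800_area_mm2, d800_pixels = sensors["D800 (FX, 36 MP)"]
d800_site_um2 = d800_area_mm2 / d800_pixels * 1e6   # square microns per photosite

for name, (area_mm2, pixel_count) in sensors.items():
    site_um2 = area_mm2 / pixel_count * 1e6
    print(f"{name}: ~{site_um2:.0f} um^2 per photosite ({site_um2 / d800_site_um2:.1f}x the D800)")
```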

To put this into practical terms let’s consider the next diagram:


Increased pixel pitch = Increased Signal to Noise Ratio

The importance of S/N ratio and its relevance to camera sensor noise can be seen clearly in the diagram above – but we are talking about base noise at native or base ISO.

If we now look at increasing the ISO speed we have a potential problem.

As I mentioned before, increasing ISO is basically UNDER EXPOSURE followed by in-camera “push processing” – now I’m showing my age..


The effect of increased ISO – in-camera “push processing” automatically lifts the exposure value to where the camera thinks it is supposed to be.

By under exposing the image we reduce the overall Signal to Noise Ratio, then the camera internals lift all the levels by a process of amplification – and this includes amplifying the original level of base noise.
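Here’s a toy model of that process. The photon and noise figures are purely illustrative (real sensors are more complicated), but it shows how under-exposing and then amplifying eats into the signal-to-noise ratio:

```python
import math

# Toy model of raising ISO: under-expose (capture fewer photons), then amplify
# everything in camera, signal AND base noise, so the S/N ratio set at capture falls.
full_exposure_photons = 40_000   # illustrative photon count at base ISO
base_noise = 30                  # illustrative read/pattern/thermal noise

for stops_under in range(5):                     # base ISO, then +1..+4 stops of 'push'
    signal = full_exposure_photons / 2 ** stops_under
    shot_noise = math.sqrt(signal)
    total_noise = math.hypot(shot_noise, base_noise)
    print(f"+{stops_under} stops of ISO: SNR ~ {signal / total_noise:.0f}")
```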

So now you know WHY and HOW your images look noisy at higher ISOs – or so you’d think – again, it’s not that simple; take the next two image crops for instance:


Kingfisher – ISO 3200 Nikon D4 – POOR LIGHT – Click for bigger view


Kingfisher – ISO 3200 Nikon D4 – GOOD LIGHT – CLICK for bigger view

If you click on the images (they’ll open up in new browser tabs) you’ll see that the noise from 3200 ISO on the D4 is a lot more apparent on the image taken in poor light than it is on the image taken in full sun.

You’ll also notice that in both cases the noise is less apparent in the high frequency detail (sharp high detail areas) and more apparent in areas of low frequency detail (blurred background).

So here’s “The Andy Approach” to noise and high ISO.

1. It’s not a good idea to use higher ISO settings just to combat poor light – in poor light everything looks like crap, and if it looks crap then the image will look even crappier. When I’m in a poor light situation and I’m not faced with a “shot in a million”, I don’t take the shot.

2. There’s a big difference between poor light and low light that looks good – if it’s the latter, shoot as close to base ISO as you can get away with in terms of shutter speed.

3. If you shoot landscapes then shoot at base ISO at all times and use a tripod and remote release – make full use of your sensor’s dynamic range.

4. The Important One – don’t get hooked on megapixels and so-called sensor resolution – I’ve made thousands of landscape sales shot on a 12Mp D3 at 100 ISO. If you are compelled to have more megapixels, buy a medium format camera, which will give a higher S/N ratio because the photosites are larger.

5. If you shoot wildlife you’ll find that the need for full dynamic range decreases with narrowing angle of view/increasing focal length – using a 500mm lens you are looking at a very small section of what your eye can see, and the tones contained within that small window will rarely occupy anywhere near the camera’s full dynamic range.

Under good light this will allow you to use a higher ISO in order to gain that crucial bit of extra shutter speed – remember, wildlife images tend to be at least 30 to 35% high frequency detail, and noise will not be as apparent in those areas as it is in the background; hence the ubiquitous saying of wildlife photographers: “Watch your background at all times”.

Well, I think that’s enough to be going on with – but there’s oh so much more!
