DIGITAL PHOTOGRAPHY TUTORIALS Learn how to take and edit digital photographs using visual tutorials that emphasize concept over procedure, independent of specific digital camera or lens. This is a complete listing of all tutorials on this site; click the drop-down links in the top menu to see particular topics.
CONCEPTS & TERMINOLOGY

How Your Camera Works
• Understanding Digital Camera Sensors
• Understanding Camera Exposure: Aperture, ISO & Shutter Speed
• Understanding Camera Metering
• Understanding Depth of Field
• Understanding Camera Lenses: Focal Length & Aperture
• Understanding White Balance
• Understanding Camera Autofocus
Qualities of Digital Photos
• Understanding Bit Depth
• Understanding Sharpness
• Understanding Image Noise
• Understanding Dynamic Range
Advanced Concepts
• Digital Camera Sensor Sizes: How Do These Influence Photography?
• Understanding Diffraction: Pixel Size, Aperture and Airy Disks
• Digital Cameras vs. The Human Eye (NEW)
• Understanding the Hyperfocal Distance
CAMERA EQUIPMENT

Camera Types & Accessories
• Compact vs. Digital SLR Cameras
• Selecting & Using a Camera Tripod
• Camera Flash, Part 1: Light Quality & Appearance
• Camera Flash, Part 2: Flash Ratios & Exposure
Camera Lenses
• Understanding Camera Lenses: Focal Length & Aperture
• Using Wide Angle Lenses
• Using Telephoto Lenses
• Tilt/Shift Lenses: Using Shift Movements to Control Perspective
• Tilt/Shift Lenses: Using Tilt Movements to Control Depth of Field
• Macro Lenses: Magnification, Depth of Field & Effective F-Stop
• Macro Extension Tubes & Close-up Lenses
Lens Characteristics
• Camera Lens Flare: What It Is and How to Reduce It
• Camera Lens Quality: MTF, Resolution and Contrast
Camera Lens Filters
• Choosing a Camera Lens Filter: Polarizers, UV, ND & GND
• Understanding & Using Polarizing Filters
• Using Graduated Neutral Density (GND) Filters
• Using Neutral Density (ND) Filters
Caring for Your Camera & Photos
• Digital Camera Sensor Cleaning: Tools & Techniques
• How to Make Archival Digital Photo Backups
• How to Protect Online Photos: Copying, Watermarks & Copyrights
PHOTO EDITING & POST-PROCESSING

Overview: Digital Photo Editing Workflow (NEW)
Image Files
• Understanding Bit Depth
• Understanding Image Types: JPEG & TIFF
• Understanding RAW Files: Why Should I Use RAW?
Tones & Contrast
• Understanding Histograms, Part 1: Tones & Contrast
• Understanding Histograms, Part 2: Luminosity & Color
• Using the "Levels" Tool in Photoshop
• Using the "Curves" Tool in Photoshop
• Understanding Image Posterization
Sharpening & Detail
• Understanding Sharpness
• Guide to Image Sharpening
• Sharpening Using the "Unsharp Mask" Tool
• Using Local Contrast Enhancement
Image Resizing
• Understanding Digital Image Interpolation
• A Closer Look at Resizing an Image for the Web & Email
• Optimizing Digital Photo Enlargement
Image Stacking & Multiple Exposures
• Using the High Dynamic Range (HDR) Feature in Photoshop
• Averaging Images to Reduce Noise
• Extending Depth of Field Using Focus Stacking
• F-Stop Stacking: Depth of Field & Corner Sharpness
Photo Stitching & Digital Panoramas
• Photo Stitching Digital Panoramas, Part 1: Overview & Capture
• Photo Stitching Digital Panoramas, Part 2: Using Stitching Software
• Photo Stitching Digital Panoramas: Image Projections
Specialty Techniques
• Using Lens Corrections to Improve Image Quality
• Digital Photo Restoration: Bring Old Photos Back to Life
• Converting a Color Photo Into Black & White
• How to Create Animated 3D Stereo Photos
COLOR MANAGEMENT & PRINTING

Concepts
• A Background on Color Perception
• Understanding Digital Pixels: PPI, Dithering & Print Size
Understanding the Process
• Color Management, Part 1: Concept & Overview
• Color Management, Part 2: Color Spaces
• Color Management, Part 3: Color Space Conversion
• Understanding Gamma Correction & Digital Tones
In Practice
• How to Calibrate Your Monitor for Photography
• Soft Proofing: Matching On-Screen Photos with Prints
• Working Space Comparison: sRGB vs. Adobe RGB 1998
PHOTOGRAPHY TECHNIQUES & STYLES

Camera Usage
• Using Camera Shutter Speed Creatively
• Reducing Camera Shake with Hand-Held Photos
• Digital Exposure Techniques: Expose to the Right, Clipping & Noise
Subjects
• Photography in Fog, Mist or Haze
• Introduction to Macro Photography
• Intro & Common Obstacles in Night Photography
Photo Lighting
• Making the Most of Natural Light in Photography (NEW)
• Portrait Lighting with One Light: Introduction
• Portrait Lighting with Two Lights: Fill Light
• Specialty Lights: Background, Kickers, Hair, Rim, etc.
• Overall: Studio Portrait Lighting Styles & Diagrams
Composition
• The Rule of Thirds
• Using Diagonals for Dynamic Photos
• Negative Space - Sometimes Less is More
Photography is going through an exciting transition period as many film photographers are beginning to explore the new capabilities of digital cameras. While the fundamentals have remained similar, other aspects are markedly different. This is a great time to get involved with digital photography. Because of their concept-based approach, these tutorials are rarely affected by changes in image editing software or camera equipment, and most apply to both compact and digital SLR cameras. So don't be afraid to dig deeper! You'll be sure to find what you need regardless of the camera you've purchased.
DIGITAL CAMERA SENSORS A digital camera uses an array of millions of tiny light cavities or "photosites" to record an image. When you press your camera's shutter button and the exposure begins, each of these is uncovered to collect and store photons. Once the exposure finishes, the camera closes each of these photosites, and then tries to assess how many photons fell into each. The relative quantity of photons in each cavity is then sorted into various intensity levels, whose precision is determined by bit depth (0-255 for an 8-bit image).
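The final step, sorting relative photon counts into discrete intensity levels, can be sketched as a simple linear quantization. This is only a toy model under stated assumptions (a perfectly linear sensor, no noise, no tone curve); the photon counts and full-well capacity below are invented for illustration:

```python
import numpy as np

# Hypothetical photon counts collected by six photosites during one exposure.
photons = np.array([[120, 3400, 56000],
                    [900, 15000, 31000]])

full_well = 60000        # assumed photon capacity of each cavity at clipping
bit_depth = 8            # an 8-bit image allows levels 0..255
levels = 2 ** bit_depth - 1

# Linear quantization: scale counts into the available levels and truncate.
intensity = np.clip(photons * levels / full_well, 0, levels).astype(np.uint8)
print(intensity)
```

A 12- or 14-bit RAW file works the same way, just with 4096 or 16384 levels instead of 256.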
Cavity Array
Light Cavities
However, the above illustration would only create grayscale images, since these cavities cannot distinguish how much of each color they have received. To capture color images, a filter has to be placed over each cavity that permits only particular colors of light. Virtually all current digital cameras can only capture one of the three primary colors in each cavity, and so they discard roughly 2/3 of the incoming light. As a result, the camera has to approximate the other two primary colors in order to have full color at every pixel. The most common type of color filter array is called a "Bayer array," shown below.
Color Filter Array
Photosites with Color Filters
A Bayer array consists of alternating rows of red-green and green-blue filters. Notice how the Bayer array contains twice as many green as red or blue sensors. Each primary color does not receive an equal fraction of the total area because the human eye is more sensitive to green light than to either red or blue light. This redundancy with green pixels produces an image which appears less noisy and has finer detail than could be accomplished if each color were treated equally. It also explains why noise in the green channel is much lower than for the other two primary colors (see "Understanding Image Noise" for an example).
Original Scene (shown at 200%)
What Your Camera Sees (through a Bayer array)
Note: Not all digital cameras use a Bayer array; however, this is by far the most common setup. For example, the Foveon sensor captures all three colors at each pixel location, whereas other sensors might capture four colors in a similar array: red, green, blue and emerald green.
BAYER DEMOSAICING Bayer "demosaicing" is the process of translating this Bayer array of primary colors into a final image which contains full color information at each pixel. How is this possible if the camera is unable to directly measure full color? One way of understanding this is to instead think of each 2x2 array of red, green and blue as a single full color cavity.
This would work fine, however most cameras take additional steps to extract even more image information from this color array. If the camera treated all of the colors in each 2x2 array as having landed in the same place, then it would only be able to achieve half the resolution in both the horizontal and vertical directions. On the other hand, if a camera computed the color using several overlapping 2x2 arrays, then it could achieve a higher resolution than would be possible with a single set of 2x2 arrays. The following combination of overlapping 2x2 arrays could be used to extract more image information.
Note how we did not calculate image information at the very edges of the array, since we assumed the image continued in each direction. If these were actually the edges of the cavity array, then calculations here would be less accurate, since there are no longer pixels on all sides. This is typically negligible though, since information at the very edges of an image can easily be cropped out for cameras with millions of pixels. Other demosaicing algorithms exist which can extract slightly more resolution, produce images which are less noisy, or adapt to best approximate the image at each location.
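The 2x2 interpretation above can be sketched in a few lines. This is an illustrative toy rather than any camera's actual demosaicing algorithm; the mosaic values and the RGGB layout are assumptions. It relies on the fact that any 2x2 window on a Bayer grid holds exactly one red, one blue and two green photosites:

```python
import numpy as np

# Toy 4x4 Bayer mosaic (RGGB pattern); each photosite stores a single number.
mosaic = np.array([
    [200,  60, 180,  50],    # R G R G
    [ 70,  30,  65,  25],    # G B G B
    [190,  55, 170,  45],    # R G R G
    [ 60,  20,  58,  22],    # G B G B
], dtype=float)

def demosaic(mosaic):
    """Naive demosaic: treat every overlapping 2x2 window as one full-color
    pixel, averaging the two green photosites in each window."""
    h, w = mosaic.shape
    yy, xx = np.mgrid[0:h, 0:w]
    is_r = (yy % 2 == 0) & (xx % 2 == 0)    # red photosite positions
    is_b = (yy % 2 == 1) & (xx % 2 == 1)    # blue photosite positions
    is_g = ~(is_r | is_b)                   # the remaining half are green
    out = np.zeros((h - 1, w - 1, 3))
    for y in range(h - 1):
        for x in range(w - 1):
            win = mosaic[y:y + 2, x:x + 2]
            out[y, x, 0] = win[is_r[y:y + 2, x:x + 2]].mean()
            out[y, x, 1] = win[is_g[y:y + 2, x:x + 2]].mean()  # 2 greens
            out[y, x, 2] = win[is_b[y:y + 2, x:x + 2]].mean()
    return out

rgb = demosaic(mosaic)
print(rgb.shape)     # one fewer row and column than the mosaic
print(rgb[0, 0])     # full color at the first pixel
```

Note how the overlapping windows yield a 3x3 full-color result from a 4x4 mosaic, i.e. nearly full resolution, whereas non-overlapping 2x2 blocks would give only 2x2.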
DEMOSAICING ARTIFACTS Images with small-scale detail near the resolution limit of the digital sensor can sometimes trick the demosaicing algorithm—producing an unrealistic looking result. The most common artifact is moiré (pronounced "more-ay"), which may appear as repeating patterns, color artifacts or pixels arranged in an unrealistic maze-like pattern:
Second Photo at 65% of Above Size
Two separate photos are shown above—each at a different magnification. Note the appearance of moiré in all four bottom squares, in addition to the third square of the first photo (subtle). Both maze-like and color artifacts can be seen in the third square of the downsized version. These artifacts depend on both the type of texture and software used to develop the digital camera's RAW file. However, even with a theoretically perfect sensor that could capture and distinguish all colors at each photosite, moiré and other artifacts could still appear. This is an unavoidable consequence of any system that samples an otherwise continuous signal at discrete intervals or locations. For this reason, virtually every photographic digital sensor incorporates something called an optical low-pass filter (OLPF) or an anti-aliasing (AA) filter. This is typically a thin layer directly in front of the sensor, and works by effectively blurring any potentially problematic details that are finer than the resolution of the sensor.
MICROLENS ARRAYS You might wonder why the first diagram in this tutorial did not place each cavity directly next to each other. Real-world camera sensors do not actually have photosites which cover the entire surface of the sensor. In fact, they may cover just half the total area in order to accommodate other electronics. Each cavity is shown with little peaks between them to direct the photons to one cavity or the other. Digital cameras contain "microlenses" above each photosite to enhance their light-gathering ability. These lenses are analogous to funnels which direct photons into the photosite where the photons would have otherwise been unused.
Well-designed microlenses can improve the photon signal at each photosite, and subsequently create images which have less noise for the same exposure time. Camera manufacturers have used improvements in microlens design to maintain or reduce noise in the latest high-resolution cameras, despite the smaller photosites that result from squeezing more megapixels into the same sensor area.
CAMERA EXPOSURE A photograph's exposure determines how light or dark an image will appear when it's been captured by your camera. Believe it or not, this is determined by just three camera settings: aperture, ISO and shutter speed (the "exposure triangle"). Mastering their use is an essential part of developing an intuition for photography.
UNDERSTANDING EXPOSURE
Achieving the correct exposure is a lot like collecting rain in a bucket. While the rate of rainfall is uncontrollable, three factors remain under your control: the bucket's width, the duration you leave it in the rain, and the quantity of rain you want to collect. You just need to ensure you don't collect too little ("underexposed"), but that you also don't collect too much ("overexposed"). The key is that there are many different combinations of width, time and quantity that will achieve this. For example, for the same quantity of water, you can get away with less time in the rain if you pick a bucket that's really wide. Alternatively, for the same duration left in the rain, a really narrow bucket can be used as long as you plan on getting by with less water. In photography, the exposure settings of aperture, shutter speed and ISO speed are analogous to the width, time and quantity discussed above. Furthermore, just as the rate of rainfall was beyond your control above, so too is natural light for a photographer.
EXPOSURE TRIANGLE: APERTURE, ISO & SHUTTER SPEED
Each setting controls exposure differently:
Aperture: controls the area over which light can enter your camera
Shutter speed: controls the duration of the exposure
ISO speed: controls the sensitivity of your camera's sensor to a given amount of light
One can therefore use many combinations of the above three settings to achieve the same exposure. The key, however, is knowing which trade-offs to make, since each setting also influences other image properties. For example, aperture affects depth of field, shutter speed affects motion blur and ISO speed affects image noise. The next few sections will describe how each setting is specified, what it looks like, and how a given camera exposure mode affects their combination.
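Since the three settings trade off against one another, their combined effect can be summarized with the photographic exposure-value formula, EV = log2(N²/t), adjusted for ISO. A small sketch showing three equivalent combinations (the particular settings are arbitrary examples, not recommendations):

```python
import math

def exposure_value(f_number, shutter_s, iso):
    """ISO-adjusted exposure value. Combinations of aperture, shutter speed
    and ISO with equal EV record the same overall image brightness."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# Three different trade-offs that all produce the same exposure:
shallow = exposure_value(2.8, 1 / 100, 100)  # wide aperture: shallow DoF
deep    = exposure_value(5.6, 1 / 25, 100)   # 2 stops narrower, 2 stops slower
noisy   = exposure_value(5.6, 1 / 100, 400)  # or raise ISO 2 stops: more noise
print(shallow, deep, noisy)
```

All three calls return the same value, which is exactly the point: the settings differ, the exposure does not, and the choice between them comes down to depth of field, motion blur and noise.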
SHUTTER SPEED A camera's shutter determines when the camera sensor will be open or closed to incoming light from the camera lens. The shutter speed specifically refers to how long this light is permitted to enter the camera. "Shutter speed" and "exposure time" refer to the same concept, where a faster shutter speed means a shorter exposure time. By the Numbers. Shutter speed's influence on exposure is perhaps the simplest of the three camera settings: it correlates exactly 1:1 with the amount of light entering the camera. For example, when the exposure time doubles, the amount of light entering the camera doubles. It's also the setting that has the widest range of possibilities:

Shutter Speed            Typical Examples
1 - 30+ seconds          Specialty night and low-light photos on a tripod
2 - 1/2 second           To add a silky look to flowing water; landscape photos on a tripod for enhanced depth of field
1/2 to 1/30 second       To add motion blur to the background of a moving subject; carefully taken hand-held photos with stabilization
1/50 - 1/100 second      Typical hand-held photos without substantial zoom
1/250 - 1/500 second     To freeze everyday sports/action subject movement; hand-held photos with substantial zoom (telephoto lens)
1/1000 - 1/4000 second   To freeze extremely fast, up-close subject motion

How it Appears. Shutter speed is a powerful tool for freezing or exaggerating the appearance of motion:
Slow Shutter Speed
Fast Shutter Speed With waterfalls and other creative shots, motion blur is sometimes desirable, but for most other shots this is avoided. Therefore all one usually cares about with shutter speed is whether it results in a sharp photo — either by freezing movement or because the shot can be taken hand-held without camera shake. How do you know which shutter speed will provide a sharp hand-held shot? With digital cameras, the best way to find out is to just experiment and look at the results on your camera's rear LCD screen (at full zoom). If a properly focused photo comes out blurred, then you'll usually need to either increase the shutter speed, keep your hands steadier or use a camera tripod. For more on this topic, see the tutorial on Using Camera Shutter Speed Creatively.
APERTURE SETTING A camera's aperture setting controls the area over which light can pass through your camera lens. It is specified in terms of an f-stop value, which can at times be counterintuitive, because the area of the opening increases as the f-stop decreases. In photographer slang, when someone says they are "stopping down" or "opening up" their lens, they are referring to increasing and decreasing the f-stop value, respectively.
By the Numbers. Every time the f-stop value halves, the light-collecting area quadruples. There's a formula for this, but most photographers just memorize the f-stop numbers that correspond to each doubling/halving of light:

Aperture Setting   Relative Light   Example Shutter Speed
f/22               1X               16 seconds
f/16               2X               8 seconds
f/11               4X               4 seconds
f/8.0              8X               2 seconds
f/5.6              16X              1 second
f/4.0              32X              1/2 second
f/2.8              64X              1/4 second
f/2.0              128X             1/8 second
f/1.4              256X             1/15 second

The above aperture and shutter speed combinations all result in the same exposure.
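The pattern behind such tables follows from the fact that light-collecting area scales with the square of the aperture diameter, i.e. with 1/N² for f-stop N. A quick sketch that regenerates relative light and equivalent shutter time for the standard full stops:

```python
# Relative light scales with 1/N^2, so each standard f-stop step roughly
# doubles the light and halves the required shutter time.
f_stops = [22, 16, 11, 8.0, 5.6, 4.0, 2.8, 2.0, 1.4]
shutter_at_f22 = 16.0  # seconds, an assumed baseline exposure at f/22

for n in f_stops:
    relative_light = (22 / n) ** 2           # light gathered, relative to f/22
    shutter = shutter_at_f22 / relative_light  # equivalent exposure time
    print(f"f/{n:<4} {relative_light:7.1f}X  {shutter:.3g} s")
```

Because the marked f-stop numbers are rounded (f/11 is really about 11.31, and f/5.6 about 5.657), the computed ratios land near, rather than exactly on, the nominal 2X steps.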
Note: Shutter speed values are not always possible in increments of exactly double or half another shutter speed, but they're always close enough that the difference is negligible. The above f-stop numbers are all standard options in any camera, although most also allow finer adjustments of 1/2 or 1/3 stops, such as f/3.2 and f/6.3. The range of values may also vary from camera to camera (or lens to lens). For example, a compact camera might have an available range of f/2.8 to f/8.0, whereas a digital SLR camera might have a range of f/1.4 to f/32 with a portrait lens. A narrow aperture range usually isn't a big problem, but a greater range does provide for more creative flexibility.

Technical Note: With many lenses, their light-gathering ability is also affected by their transmission efficiency, although this is almost always much less of a factor than aperture. It's also beyond the photographer's control. Differences in transmission efficiency are typically more pronounced with extreme zoom ranges. For example, Canon's 24-105 mm f/4L IS lens gathers perhaps ~10-40% less light at f/4 than Canon's similar 24-70 mm f/2.8L lens at f/4 (depending on the focal length).

How it Appears. A camera's aperture setting is what determines a photo's depth of field (the range of distance over which objects appear in sharp focus). Lower f-stop values correlate with a shallower depth of field:
Wide Aperture f/2.0 - low f-stop number shallow depth of field
Narrow Aperture f/16 - high f-stop number large depth of field
ISO SPEED The ISO speed determines how sensitive the camera is to incoming light. Similar to shutter speed, it also correlates 1:1 with how much the exposure increases or decreases. However, unlike aperture and shutter speed, a lower ISO speed is almost always desirable, since higher ISO speeds dramatically increase image noise. As a result, ISO speed is usually only increased from its minimum value if the desired aperture and shutter speed aren't otherwise obtainable.
Low ISO Speed (low image noise)
High ISO Speed (high image noise)
Note: in traditional film photography, the counterpart of image noise is "film grain." Common ISO speeds include 100, 200, 400 and 800, although many cameras also permit lower or higher values. With compact cameras, an ISO speed in the range of 50-200 generally produces acceptably low image noise, whereas with digital SLR cameras, a range of 50-800 (or higher) is often acceptable.
CAMERA EXPOSURE MODES
Most digital cameras have one of the following standardized exposure modes: Auto ( ), Program (P), Aperture Priority (Av), Shutter Priority (Tv), Manual (M) and Bulb (B) mode. Av, Tv, and M are often called "creative modes" or "auto exposure (AE) modes." Each of these modes influences how aperture, ISO and shutter speed are chosen for a given exposure. Some modes attempt to pick all three values for you, whereas others let you specify one setting and the camera picks the other two (if possible). The following table describes how each mode pertains to exposure:

Exposure Mode                 How It Works
Auto ( )                      Camera automatically selects all exposure settings.
Program (P)                   Camera automatically selects aperture & shutter speed; you can choose a corresponding ISO speed & exposure compensation. With some cameras, P can also act as a hybrid of the Av & Tv modes.
Aperture Priority (Av or A)   You specify the aperture & ISO; the camera's metering determines the corresponding shutter speed.
Shutter Priority (Tv or S)    You specify the shutter speed & ISO; the camera's metering determines the corresponding aperture.
Manual (M)                    You specify the aperture, ISO and shutter speed — regardless of whether these values lead to a correct exposure.
Bulb (B)                      Useful for exposures longer than 30 seconds. You specify the aperture and ISO; the shutter speed is determined by a remote release switch, or by the duration until you press the shutter button a second time.

In addition, the camera may also have several pre-set modes; the most common include landscape, portrait, sports and night mode. The symbols used for each mode vary slightly from camera to camera, but will likely appear similar to those below:

Exposure Mode     How It Works
Portrait          Camera tries to pick the lowest f-stop value possible for a given exposure. This ensures the shallowest possible depth of field.
Landscape         Camera tries to pick a high f-stop to ensure a large depth of field. Compact cameras also often set their focus distance to distant objects or infinity.
Sports/Action     Camera tries to achieve as fast a shutter speed as possible for a given exposure — ideally 1/250 seconds or faster. In addition to using a low f-stop, the fast shutter speed is usually achieved by increasing the ISO speed more than would otherwise be acceptable in portrait mode.
Night/Low-light   Camera permits shutter speeds which are longer than ordinarily allowed for hand-held shots, and increases the ISO speed to near its maximum available value. However, for some cameras this setting means that a flash is used for the foreground, and a long shutter speed and high ISO are used to expose the background. Check your camera's instruction manual for any unique characteristics.

However, keep in mind that most of the above settings rely on the camera's metering system in order to know what's a proper exposure. For tricky subject matter, metering can often be fooled, so it's a good idea to also be aware of when it might go awry, and what you can do to compensate for such exposure errors (see the section on exposure compensation within the camera metering tutorial).

Finally, some of the above modes may also control camera settings which are unrelated to exposure, although this varies from camera to camera. Such additional settings might include the autofocus points, metering mode and autofocus modes, amongst others.
CAMERA METERING & EXPOSURE Knowing how your digital camera meters light is critical for achieving consistent and accurate exposures. Metering is the brains behind how your camera determines the shutter speed and aperture, based on lighting conditions and ISO speed. Metering options often include partial, evaluative zone or matrix, center-weighted and spot metering. Each of these has subject lighting conditions for which it excels — and for which it fails. Understanding these can improve one's photographic intuition. Recommended background reading: camera exposure: aperture, ISO & shutter speed
BACKGROUND: INCIDENT vs. REFLECTED LIGHT All in-camera light meters have a fundamental flaw: they can only measure reflected light. This means the best they can do is guess how much light is actually hitting the subject.
If all objects reflected the same percentage of incident light, this would work just fine, however real-world subjects vary greatly in their reflectance. For this reason, in-camera metering is standardized based on the luminance of light which would be reflected from an object appearing as middle gray. If the camera is aimed directly at any object lighter or darker than middle gray, the camera's light meter will incorrectly calculate under or over-exposure, respectively. A hand-held light meter would calculate the same exposure for any object under the same incident lighting.
18% Gray Tone
18% Red Tone
18% Green Tone
18% Blue Tone
Above patches depict approximations of 18% luminance. This will appear most accurate on a PC display which closely mimics the sRGB color space, and when you have calibrated your monitor accordingly. Monitors emit light, as opposed to reflecting it, so this is also a fundamental limitation. What constitutes middle gray? In the printing industry it is standardized as the ink density which reflects 18% of incident light; however, cameras seldom adhere to this. This topic deserves a discussion of its own, but for the purposes of this tutorial, just know that each camera treats middle gray slightly differently, and that it's usually somewhere between 10-18% reflectance. Metering off a subject which reflects more or less light than this may cause your camera's metering algorithm to go awry — either through under or overexposure, respectively.
An in-camera light meter can work surprisingly well if object reflectance is sufficiently diverse throughout the photo. In other words, if there is an even spread varying from dark to light objects, then the average reflectance will remain roughly middle gray. Unfortunately, some scenes may have a significant imbalance in subject reflectivity, such as a photo of a white dove in the snow, or of a black dog sitting on a pile of charcoal. For such cases the camera may try to create an image with a histogram whose primary peak is in the midtones, even though it should have instead produced this peak in the highlights or shadows (see high and low-key histograms).
METERING OPTIONS In order to accurately expose a greater range of subject lighting and reflectance combinations, most cameras have several metering options. Each option works by assigning a relative weighting to different light regions; regions with a higher weighting are considered more reliable, and thus contribute more to the final exposure calculation.
Center-Weighted
Partial Metering
Spot Metering Partial and spot areas are roughly 13.5% and 3.8% of the picture area, respectively, which correspond to settings on the Canon EOS 1D Mark II.
The whitest regions are those which contribute most towards the exposure calculation, whereas black areas are ignored. Each of the above metering diagrams may also be located off-center, depending on the metering options and autofocus point used.
More sophisticated algorithms may go beyond just a regional map and include: evaluative, zone and matrix metering. These are usually the default when your camera is set to auto exposure. Each generally works by dividing the image up into numerous sub-sections, where each section is then considered in terms of its relative location, light intensity or color. The location of the autofocus point and orientation of the camera (portrait vs. landscape) may also contribute to the calculation.
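The idea of regional weighting can be sketched numerically. This is a simplified model, not any manufacturer's actual algorithm; the scene luminances, the weight maps, and the 18% middle-gray target are all assumptions for illustration:

```python
import numpy as np

def meter_exposure_shift(luminance, weights, middle_gray=0.18):
    """Exposure shift (in stops) that would bring the weighted scene average
    onto middle gray. Positive means the meter would brighten the image."""
    avg = np.average(luminance, weights=weights)
    return np.log2(middle_gray / avg)

# Toy 3x3 scene: bright sky in the top row, darker ground below.
scene = np.array([[0.80, 0.80, 0.80],
                  [0.15, 0.18, 0.15],
                  [0.10, 0.12, 0.10]])

# Center-weighted: the middle region dominates; spot: only the middle counts.
center_w = np.array([[1, 1, 1], [1, 8, 1], [1, 1, 1]])
spot_w   = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]])

print(meter_exposure_shift(scene, center_w))  # pulled down by the bright sky
print(meter_exposure_shift(scene, spot_w))    # meters the subject only
```

Here the spot reading lands exactly on middle gray (no shift needed), while the center-weighted average is pulled up by the bright sky, so the meter would suggest darkening the exposure — the same bias described for bright backgrounds above.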
WHEN TO USE PARTIAL & SPOT METERING Partial and spot metering give the photographer far more control over the exposure than any of the other settings, but this also means that these are more difficult to use — at least initially. They are useful when there is a relatively small object within your scene which you either need to be perfectly exposed, or know will provide the closest match to middle gray. One of the most common applications of partial metering is a portrait of someone who is back-lit. Metering off their face can help avoid an exposure that makes the subject appear as an under-exposed silhouette against the bright background. On the other hand, care should be taken as the shade of a person's skin may lead to inaccurate exposure if this shade is far from neutral gray reflectance (although not by as much as with backlighting). Spot metering is used less often because its metering area is very small and thus quite specific. This can be an advantage when you are unsure of your subject's reflectance and have a specially designed gray card (or other small object) to meter off of.
Spot and partial metering are also quite useful for creative exposures, and when ambient lighting is unusual. In the examples to the left and right below, one could meter off the diffusely lit foreground tiles, or off the directly lit stone below the sky opening:
NOTES ON CENTER-WEIGHTED METERING Center-weighted metering was once a very common default setting in cameras because it coped well with a bright sky above a darker landscape. Nowadays, it has more or less been surpassed in flexibility by evaluative and matrix, and in specificity by partial and spot metering. On the other hand, the results produced by center-weighted metering are very predictable, whereas matrix and evaluative metering
modes have complicated algorithms which are harder to predict. For this reason some still prefer to use center-weighted as the default metering mode.
EXPOSURE COMPENSATION Any of the above metering modes can use a feature called exposure compensation (EC). When this is activated, the metering calculation still works as normal, but the final exposure target gets compensated by the EC value. This allows for manual corrections if you observe a metering mode to be consistently under or over-exposing. Most cameras allow up to 2 stops of exposure compensation, where each stop provides either a doubling or halving of light compared to what the metering mode would have done otherwise. A setting of zero means no compensation will be applied (which is the default). Exposure compensation is ideal for correcting in-camera metering errors caused by the subject's reflectivity. No matter what metering mode is used, an in-camera light meter will always mistakenly under-expose a subject such as a white dove in a snowstorm (see incident vs. reflected light). Photographs in the snow will therefore always require around +1 exposure compensation, whereas a low-key image may require negative compensation. When shooting in RAW mode under tricky lighting, sometimes it is useful to set a slight negative exposure compensation (0.3-0.5 stops). This decreases the chance of clipped highlights, yet still allows one to increase the exposure afterwards. Alternatively, a positive exposure compensation can be used to improve the signal to noise ratio in situations where the highlights are far from clipping.
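Numerically, applying exposure compensation is just a power-of-two scaling of the metered exposure. A minimal sketch, assuming the compensation is applied through shutter speed (cameras may instead adjust aperture or ISO):

```python
def apply_exposure_compensation(shutter_s, ec_stops):
    """Scale the metered exposure time by 2^EC; each stop doubles the light."""
    return shutter_s * 2 ** ec_stops

metered = 1 / 250                                   # what the meter suggested
print(apply_exposure_compensation(metered, +1.0))   # snow scene: 1/125 s
print(apply_exposure_compensation(metered, -0.5))   # protect RAW highlights
```

Fractional stops work the same way, which is why a -0.3 to -0.5 setting gives only a slightly shorter exposure rather than a full halving.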
TUTORIALS: DEPTH OF FIELD Depth of field refers to the range of distance that appears acceptably sharp. It varies depending on camera type, aperture and focusing distance, although print size and viewing distance can also influence our perception of depth of field. This tutorial is designed to give a better intuitive and technical understanding for photography, and provides a depth of field calculator to show how it varies with your camera settings.
The depth of field does not abruptly change from sharp to unsharp, but instead occurs as a gradual transition. In fact, everything immediately in front of or behind the focusing distance begins to lose sharpness — even if this is not perceived by our eyes or by the resolution of the camera.
CIRCLE OF CONFUSION
Since there is no critical point of transition, a more rigorous term called the "circle of confusion" is used to define how much a point needs to be blurred in order to be perceived as unsharp. When the circle of confusion becomes perceptible to our eyes, this region is said to be outside the depth of field and thus no
longer "acceptably sharp." The circle of confusion above has been exaggerated for clarity; in reality this would be only a tiny fraction of the camera sensor's area.
When does the circle of confusion become perceptible to our eyes? An acceptably sharp circle of confusion is loosely defined as one which would go unnoticed when enlarged to a standard 8x10 inch print, and observed from a standard viewing distance of about 1 foot.
At this viewing distance and print size, camera manufacturers assume a circle of confusion is negligible if no larger than 0.01 inches (when enlarged). As a result, camera manufacturers use the 0.01 inch standard when providing lens depth of field markers (shown below for f/22 on a 50mm lens). In reality, a person with 20/20 vision or better can distinguish features 1/3 this size, and so the circle of confusion has to be even smaller than this to achieve acceptable sharpness throughout. A different maximum circle of confusion also applies for each print size and viewing distance combination. In the earlier example of blurred dots, the circle of confusion is actually smaller than the resolution of your screen for the two dots on either side of the focal point, and so these are considered within the depth of field. Alternatively, the depth of field can be based on when the circle of confusion becomes larger than the size of your digital camera's pixels.

Note that depth of field only sets a maximum value for the circle of confusion, and does not describe what happens to regions once they become out of focus. These regions are also called "bokeh," from Japanese (pronounced bo-ké). Two images with identical depth of field may have significantly different bokeh, as this depends on the shape of the lens diaphragm. In reality, the circle of confusion is usually not actually a circle, but is only approximated as such when it is very small. When it becomes large, most lenses will render it as a polygonal shape with 5-8 sides.
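The 0.01 inch print standard can be translated back into a maximum circle of confusion on the sensor by dividing by the enlargement factor. A minimal sketch, assuming the image is enlarged so the sensor's long dimension fills the print's long dimension (the function name and this simplification are my own):

```python
def max_coc_on_sensor_mm(sensor_long_mm, print_long_in=10.0, coc_print_in=0.01):
    """Translate a print-based circle of confusion standard to the sensor.

    Assumes uniform enlargement along the long dimension only
    (a simplification; aspect ratios are ignored).
    """
    MM_PER_INCH = 25.4
    enlargement = (print_long_in * MM_PER_INCH) / sensor_long_mm
    return (coc_print_in * MM_PER_INCH) / enlargement

# A 35 mm frame (36 mm long) enlarged to an 8x10 inch print gives
# roughly 0.036 mm -- close to the commonly quoted ~0.03 mm standard.
```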
CONTROLLING DEPTH OF FIELD Although print size and viewing distance influence how large the circle of confusion appears to our eyes, aperture and focusing distance are the two main factors that determine how big the circle of confusion will be on your camera's sensor. Larger apertures (smaller f-stop numbers) and closer focusing distances produce a shallower depth of field. The following test maintains the same focus distance, but changes the aperture setting:
f/8.0 · f/5.6 · f/2.8
note: images taken with a 200 mm lens (320 mm field of view on a 35 mm camera)
CLARIFICATION: FOCAL LENGTH AND DEPTH OF FIELD Note that focal length has not been listed as influencing depth of field, contrary to popular belief. Even though telephoto lenses appear to create a much shallower depth of field, this is mainly because they are often used to magnify the subject when one is unable to get closer. If the subject occupies the same fraction of the image (constant magnification) for both a telephoto and a wide angle lens, the total depth of field is
virtually* constant with focal length! This would of course require you to either get much closer with a wide angle lens or much further away with a telephoto lens, as demonstrated in the following chart:

Focal Length (mm)     10      20      50      100     200     400
Focus Distance (m)    0.5     1.0     2.5     5.0     10      20
Depth of Field (m)    0.482   0.421   0.406   0.404   0.404   0.404
Note: Depth of field calculations are at f/4.0 on a camera with a 1.6X crop factor, using a circle of confusion of 0.0206 mm.

Note how there is indeed a subtle change for the smallest focal lengths. This is a real effect, but is negligible compared to both aperture and focusing distance. Even though the total depth of field is virtually constant, the fraction of the depth of field which is in front of and behind the focus distance does change with focal length, as demonstrated below:

Distribution of the Depth of Field
Focal Length (mm)     10       20       50       100      200      400
Rear                  70.2 %   60.1 %   54.0 %   52.0 %   51.0 %   50.5 %
Front                 29.8 %   39.9 %   46.0 %   48.0 %   49.0 %   49.5 %
This exposes a limitation of the traditional DoF concept: it only accounts for the total DoF and not its distribution around the focal plane, even though both may contribute to the perception of sharpness. Note how a wide angle lens provides a more gradually fading DoF behind the focal plane than in front, which is important for traditional landscape photographs.

Longer focal lengths may also appear to have a shallower depth of field because they enlarge the background relative to the foreground (due to their narrower angle of view). This can make an out of focus background look even more out of focus because its blur has become enlarged. However, this is another concept entirely, since depth of field only describes the sharp region of a photo — not the blurred regions.

On the other hand, when standing in the same place and focusing on a subject at the same distance, a longer focal length lens will have a shallower depth of field (even though the pictures will frame the subject entirely differently). This is more representative of everyday use, but is an effect of higher magnification, not of focal length itself. Depth of field also appears shallower for SLR cameras than for compact digital cameras, because SLR cameras require a longer focal length to achieve the same field of view (see the tutorial on digital camera sensor sizes for more on this topic).

*Technical Note: We describe depth of field as being virtually constant because there are limiting cases where this does not hold true. For focus distances resulting in high magnification, or very near the hyperfocal distance, wide angle lenses may provide a greater DoF than telephoto lenses. At high magnification, however, the traditional DoF calculation becomes inaccurate due to another factor: pupil magnification. This reduces the DoF advantage for most wide angle lenses, and increases it for telephoto and macro lenses. At the other limiting case, near the hyperfocal distance, the increase in DoF arises because the wide angle lens has a greater rear DoF, and can thus more easily attain critical sharpness at infinity.
CALCULATING DEPTH OF FIELD In order to calculate the depth of field, one needs to first decide on an appropriate value for the maximum allowable circle of confusion. This is based on both the camera type (sensor or film size), and on the viewing distance / print size combination. Needless to say, knowing what this will be ahead of time often isn't straightforward. Try out the depth of field calculator tool to help you find this for your specific situation.
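The standard thin-lens depth of field formulas can be sketched in code. This is the common hyperfocal-distance approximation found in most DoF calculators (not necessarily the exact formula behind this site's tool); it assumes a pupil magnification of one and moderate, non-macro distances:

```python
def depth_of_field_mm(focal_mm, f_number, focus_dist_mm, coc_mm):
    """Near limit, far limit and total depth of field (all in mm).

    Uses the standard hyperfocal-distance approximation:
        H  = f^2 / (N * c) + f
        Dn = s (H - f) / (H + s - 2f)
        Df = s (H - f) / (H - s)
    where s is the focus distance and c the circle of confusion.
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = (focus_dist_mm * (hyperfocal - focal_mm)
            / (hyperfocal + focus_dist_mm - 2 * focal_mm))
    if focus_dist_mm >= hyperfocal:
        # Focused at or beyond the hyperfocal distance: far limit is infinity
        return near, float("inf"), float("inf")
    far = (focus_dist_mm * (hyperfocal - focal_mm)
           / (hyperfocal - focus_dist_mm))
    return near, far, far - near

# 100 mm at f/4.0, focused at 5 m, c = 0.0206 mm -> total DoF ~0.404 m,
# reproducing the constant-magnification chart above.
```

Computing the front fraction, (s - near) / total, also reproduces the front/rear distribution percentages for each focal length.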
DEPTH OF FOCUS & APERTURE VISUALIZATION Another implication of the circle of confusion is the concept of depth of focus (also called the "focus spread"). It differs from depth of field because it describes the distance over which light is focused at the camera's sensor, as opposed to the subject:
Diagram depicting depth of focus versus camera aperture. The purple lines comprising the edge of each shaded region represent the extreme angles at which light could potentially enter the aperture. The interior of the purple shaded regions represents all other possible angles. The key concept is this: when an object is in focus, light rays originating from that point converge at a point on the camera's sensor. If the light rays hit the sensor at slightly different locations (arriving at a disc instead of a point), then this object will be rendered as out of focus — and increasingly so depending on how far apart the light rays are.
OTHER NOTES Why not just use the smallest aperture (largest f-number) to achieve the best possible depth of field? Other than the fact that this may require prohibitively long shutter speeds without a camera tripod, too small of an aperture softens the image by creating a larger circle of confusion (or "Airy disk") due to an effect called diffraction — even within the plane of focus. Diffraction quickly becomes more of a limiting factor than depth of field as the aperture gets smaller. Despite their extreme depth of field, this is also why "pinhole cameras" have limited resolution.

For macro photography (high magnification), the depth of field is actually influenced by another factor: pupil magnification. This is equal to one for lenses which are internally symmetric, although for wide angle and telephoto lenses this is greater or less than one, respectively. A greater depth of field is achieved (than would be ordinarily calculated) for a pupil magnification less than one, whereas the pupil magnification does not change the calculation when it is equal to one. The problem is that pupil magnification is usually not provided by lens manufacturers, and one can only roughly estimate it visually.
OTHER WEBSITES & FURTHER READING
• Norman Koren provides another perspective on depth of field, including many equations for calculating depth of field and the circle of confusion
• The Luminous Landscape compares the depth of field for several focal lengths — providing visual proof that depth of field does not change much with the focal length.
UNDERSTANDING CAMERA LENSES Understanding camera lenses can help add more creative control to digital photography. Choosing the right lens for the task can become a complex trade-off between cost, size, weight, lens speed and image quality. This tutorial aims to improve understanding by providing an introductory overview of concepts relating to image quality, focal length, perspective, prime vs. zoom lenses and aperture or f-number.
LENS ELEMENTS & IMAGE QUALITY All but the simplest cameras contain lenses which are actually comprised of several "lens elements." Each of these elements directs the path of light rays to recreate the image as accurately as possible on the digital sensor. The goal is to minimize aberrations, while still utilizing the fewest and least expensive elements.
Optical aberrations occur when points in the image do not translate back onto single points after passing through the lens — causing image blurring, reduced contrast or misalignment of colors (chromatic aberration). Lenses may also suffer from uneven, radially decreasing image brightness (vignetting) or distortion. The options below illustrate how each of these can impact image quality in extreme cases:
Original Image · Loss of Contrast · Blurring · Chromatic Aberration · Distortion · Vignetting
Any of the above problems is present to some degree with any lens. In the rest of this tutorial, when a lens is referred to as having lower optical quality than another lens, this is manifested as some combination of the above artifacts. Some of these lens artifacts may not be as objectionable as others, depending on the subject matter. Note: For a more quantitative and technical discussion of the above topic, please see the tutorial on camera lens quality: MTF, resolution & contrast.
INFLUENCE OF LENS FOCAL LENGTH The focal length of a lens determines its angle of view, and thus also how much the subject will be magnified for a given photographic position. Wide angle lenses have short focal lengths, while telephoto lenses have longer corresponding focal lengths.
Note: The location where light rays cross is not necessarily equal to the focal length, as shown above, but is instead roughly proportional to this distance.
Required Focal Length Calculator (example)
Subject Distance: 36 meters
Subject Size: 2 meters
Camera Type: Digital SLR with CF of 1.6X
Required Focal Length: 400.5 mm
Note: Calculator assumes that camera is oriented such that the maximum subject dimension given by "subject size" is in the camera's longest dimension. Calculator not intended for use in extreme macro photography.
Many will say that focal length also determines the perspective of an image, but strictly speaking, perspective only changes with one's location relative to the subject. If one tries to fill the frame with the same subjects using both a wide angle and a telephoto lens, then perspective does indeed change, because one is forced to move closer to or further from the subject. For these scenarios only, the wide angle lens exaggerates or stretches perspective, whereas the telephoto lens compresses or flattens perspective.
Perspective control can be a powerful compositional tool in photography, and often determines one's choice in focal length (when one can photograph from any position). In the comparison above, the wider angle lens produces an exaggerated perspective. Note how the subjects within the frame remain nearly identical — therefore requiring a closer position for the wider angle lens. The relative sizes of objects change such that the distant doorway becomes smaller relative to the nearby lamps.

The following table provides an overview of what focal lengths are required to be considered a wide angle or telephoto lens, in addition to their typical uses. Please note that the focal lengths listed are just rough ranges, and actual uses may vary considerably; many use telephoto lenses in distant landscapes to compress perspective, for example.

Lens Focal Length*    Terminology           Typical Photography
Less than 21 mm       Extreme Wide Angle    Architecture
21-35 mm              Wide Angle            Landscape
35-70 mm              Normal                Street & Documentary
70-135 mm             Medium Telephoto      Portraiture
135-300+ mm           Telephoto             Sports, Bird & Wildlife
*Note: Lens focal lengths are for 35 mm equivalent cameras. If you have a compact or digital SLR camera, then you likely have a different sensor size. To adjust the above numbers for your camera, please use the focal length converter in the tutorial on digital camera sensor sizes.

Other factors may also be influenced by lens focal length. Telephoto lenses are more susceptible to camera shake, since small hand movements become magnified, similar to the shakiness experienced while trying to look through binoculars. Wide angle lenses are generally more resistant to flare, in part because the designers assume that the sun is more likely to be within the frame. A final consideration is that medium and telephoto lenses generally yield better optical quality for similar price ranges.
FOCAL LENGTH & HANDHELD PHOTOS
The focal length of a lens may also have a significant impact on how easy it is to achieve a sharp handheld photograph. Longer focal lengths require shorter exposure times to minimize blurring caused by shaky hands. Think of this as if one were trying to hold a laser pointer steady; when shining this pointer at a nearby object its bright spot ordinarily jumps around less than for objects further away.
This is primarily because slight rotational vibrations are magnified greatly with distance, whereas if only up and down or side to side vibrations were present, the laser's bright spot would not change with distance.
A common rule of thumb for estimating how fast the exposure needs to be for a given focal length is the one over focal length rule. This states that for a 35 mm camera, the exposure time needs to be at least as fast as one over the focal length in seconds. In other words, when using a 200 mm focal length on a 35 mm camera, the exposure time needs to be at least 1/200 seconds — otherwise blurring may be hard to avoid. See the tutorial on reducing camera shake with hand-held photos for more on this topic. Keep in mind that this rule is just for rough guidance; some may be able to hand hold a shot for much longer or shorter times. For users of digital cameras with cropped sensors, one needs to convert into a 35 mm equivalent focal length.
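The rule of thumb above translates directly into a one-line calculation. A minimal sketch (the function name is my own; this is rough guidance, not a guarantee of sharpness):

```python
def max_handheld_shutter_s(focal_mm, crop_factor=1.0):
    """One over (35 mm equivalent) focal length rule of thumb.

    Returns the longest exposure time (in seconds) that is likely
    to avoid visible blur from hand shake. Multiplying by the crop
    factor converts the focal length to its 35 mm equivalent first.
    """
    return 1.0 / (focal_mm * crop_factor)

# 200 mm on a 35 mm camera -> 1/200 s; on a 1.6X crop body -> 1/320 s
```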
ZOOM LENSES vs. PRIME LENSES A zoom lens is one where the photographer can vary the focal length within a pre-defined range, whereas this cannot be changed with a "prime" or fixed focal length lens. The primary advantage of a zoom lens is that it is easier to achieve a variety of compositions or perspectives (since lens changes are not necessary). This advantage is often critical for dynamic subject matter, such as in photojournalism and children's photography.

Keep in mind that using a zoom lens does not necessarily mean that one no longer has to change their position; zooms just increase flexibility. In the example below, the original position is shown along with two alternatives using a zoom lens. If a prime lens were used, then a change of composition would not have been possible without cropping the image (if a tighter composition were desirable). Similar to the example in the previous section, the change of perspective was achieved by zooming out and getting closer to the subject. Alternatively, to achieve the opposite perspective effect, one could have zoomed in and moved further from the subject.
Two Options Available with a Zoom Lens: Change of Composition
Change of Perspective
Why would one intentionally restrict their options by using a prime lens? Prime lenses existed long before zoom lenses were available, and still offer many advantages over their more modern counterparts. When zoom lenses first arrived on the market, one often had to be willing to sacrifice a significant amount of optical quality. However, more recent high-end zoom lenses generally do not produce noticeably lower image quality, unless scrutinized by the trained eye (or in a very large print).

The primary advantages of prime lenses are in cost, weight and speed. An inexpensive prime lens can generally provide image quality as good as (or better than) a high-end zoom lens. Additionally, if only a small fraction of the focal length range is necessary for a zoom lens, then a prime lens with a similar focal length will be significantly smaller and lighter. Finally, the best prime lenses almost always offer better light-gathering ability (larger maximum aperture) than the fastest zoom lenses — often critical for low-light sports/theater photography, and when a shallow depth of field is necessary.

For compact digital cameras, lenses listed with a 3X, 4X, etc. zoom designation refer to the ratio between the longest and shortest focal lengths. Therefore, a larger zoom designation does not necessarily mean that the image can be magnified any more (since that zoom may just have a wider angle of view when fully zoomed out). Additionally, digital zoom is not the same as optical zoom, as the former only enlarges the image through interpolation. Read the fine print to ensure you are not misled.
INFLUENCE OF LENS APERTURE OR F-NUMBER The aperture range of a lens refers to the amount that the lens can open up or close down to let in more or less light, respectively. Apertures are listed in terms of f-numbers, which quantitatively describe relative light-gathering area (depicted below).
Note: The aperture opening (iris) is rarely a perfect circle, due to the presence of the 5-8 blades of the lens diaphragm.

Note that larger aperture openings are defined to have lower f-numbers (often a point of confusion). These two terms are frequently mistakenly interchanged; the rest of this tutorial refers to lenses in terms of their aperture size. Lenses with larger apertures are also described as being "faster," because for a given ISO speed, the shutter speed can be made faster for the same exposure. Additionally, a smaller aperture means that objects can be in focus over a wider range of distance, a concept also termed the depth of field.

Corresponding Impact on Other Properties:

f-#      Light-Gathering Area (Aperture Size)    Required Shutter Speed    Depth of Field
Higher   Smaller                                 Slower                    Wider
Lower    Larger                                  Faster                    Narrower
When one is considering purchasing a lens, specifications ordinarily list the maximum (and maybe minimum) available apertures. Lenses with a greater range of aperture settings provide greater artistic flexibility, in terms of both exposure options and depth of field. The maximum aperture is perhaps the most important lens aperture specification, which is often listed on the box along with focal length(s).
An f-number of X may also be displayed as 1:X (instead of f/X), as shown below for the Canon 70-200 f/2.8 lens (whose box is also shown above and lists f/2.8).
Portrait and indoor sports/theater photography often require lenses with very large maximum apertures, in order to be capable of a narrower depth of field or a faster shutter speed, respectively. The narrow depth of field in a portrait helps isolate the subject from the background. For digital SLR cameras, lenses with larger maximum apertures also provide significantly brighter viewfinder images — possibly critical for night and low-light photography. These often give faster and more accurate auto-focusing in low light as well. Manual focusing is also easier, because the image in the viewfinder has a narrower depth of field (thus making it more visible when objects come into or out of focus).

Typical Maximum Apertures    Relative Light-Gathering Ability    Typical Lens Types
f/1.0                        32X                                 Fastest Available Prime Lenses (for Consumer Use)
f/1.4                        16X                                 Fast Prime Lenses
f/2.0                        8X                                  Fast Prime Lenses
f/2.8                        4X                                  Fastest Zoom Lenses (for Constant Aperture)
f/4.0                        2X                                  Light Weight Zoom Lenses or Extreme Telephoto Primes
f/5.6                        1X                                  Light Weight Zoom Lenses or Extreme Telephoto Primes
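Since light-gathering area scales with the aperture diameter squared, the relative values above follow directly from the ratio of f-numbers. A minimal sketch (function name is my own; the table rounds to powers of two because nominal f-stops are themselves rounded):

```python
def relative_light(f_number, reference_f_number=5.6):
    """Light-gathering area relative to a reference f-stop.

    Area scales as 1/N^2, so the ratio is (N_ref / N)^2.
    """
    return (reference_f_number / f_number) ** 2

# f/2.8 gathers 4X the light of f/5.6; f/1.4 gathers 16X;
# f/1.0 gathers ~31.4X (listed as 32X after rounding to full stops)
```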
Minimum apertures for lenses are generally nowhere near as important as maximum apertures. This is primarily because minimum apertures are rarely used, due both to photo softening from lens diffraction and to the prohibitively long exposure times they may require. For cases where an extreme depth of field is desired, lenses with smaller minimum apertures (larger maximum f-numbers) allow for a wider depth of field.
Finally, some zoom lenses on digital SLR and compact digital cameras list a range of maximum apertures, because the maximum depends on how far one has zoomed in or out. These aperture ranges therefore refer only to the range of maximum aperture, not the overall range. A range of f/2.0-3.0 would mean that the maximum available aperture gradually changes from f/2.0 (fully zoomed out) to f/3.0 (at full zoom). The primary benefit of a zoom lens with a constant maximum aperture is that exposure settings are more predictable, regardless of focal length.

Also note that just because the maximum aperture of a lens may not be used, this does not necessarily mean that such a lens is unnecessary. Lenses typically have fewer aberrations when used stopped down one or two f-stops from their maximum aperture (such as a setting of f/4.0 on a lens with a maximum aperture of f/2.0). This *may* therefore mean that if one wanted the best quality f/2.8 photograph, an f/2.0 or f/1.4 lens could yield higher quality than a lens with a maximum aperture of f/2.8.

Other considerations include cost, size and weight. Lenses with larger maximum apertures are typically much heavier, larger and more expensive. Size/weight may be critical for wildlife, hiking and travel photography, because all of these often utilize heavier lenses or require carrying equipment for extended periods of time.
FURTHER READING For more on camera lenses, also visit the following tutorials:
• Using Wide Angle Lenses
• Using Telephoto Lenses
• Macro Lenses: Magnification, Depth of Field & Effective F-Stop
TUTORIALS: WHITE BALANCE White balance (WB) is the process of removing unrealistic color casts, so that objects which appear white in person are rendered white in your photo. Proper camera white balance has to take into account the "color temperature" of a light source, which refers to the relative warmth or coolness of white light. Our eyes are very good at judging what is white under different light sources, but digital cameras often have great difficulty with auto white balance (AWB) — and can create unsightly blue, orange, or even green color casts. Understanding digital white balance can help you avoid these color casts, thereby improving your photos under a wider range of lighting conditions.
BACKGROUND: COLOR TEMPERATURE Color temperature describes the spectrum of light which is radiated from a "blackbody" with that surface temperature. A blackbody is an object which absorbs all incident light — neither reflecting it nor allowing it to pass through. A rough analogue of blackbody radiation in our day to day experience might be in heating a metal or stone: these are said to become "red hot" when they attain one temperature, and then "white hot" for even higher temperatures. Similarly, blackbodies at different temperatures also have varying color temperatures of "white light." Despite its name, light which may appear white does not necessarily contain an even distribution of colors across the visible spectrum:
Relative intensity has been normalized for each temperature (in Kelvins). Note how 5000 K produces roughly neutral light, whereas 3000 K and 9000 K produce light spectrums which shift to contain more orange and blue wavelengths, respectively. As the color temperature rises, the color distribution becomes cooler. This may not seem intuitive, but results from the fact that shorter wavelengths contain light of higher energy.

Why is color temperature a useful description of light for photographers, if they never deal with true blackbodies? Fortunately, light sources such as daylight and tungsten bulbs closely mimic the distribution of light created by blackbodies, although others such as fluorescent and most commercial lighting depart from blackbodies significantly. Since photographers never use the term color temperature to refer to a true blackbody light source, the term is implied to be a "correlated color temperature" with a similarly colored blackbody. The following table is a rule-of-thumb guide to the correlated color temperature of some common light sources:

Color Temperature    Light Source
1000-2000 K          Candlelight
2500-3500 K          Tungsten Bulb (household variety)
3000-4000 K          Sunrise/Sunset (clear sky)
4000-5000 K          Fluorescent Lamps
5000-5500 K          Electronic Flash
5000-6500 K          Daylight with Clear Sky (sun overhead)
6500-8000 K          Moderately Overcast Sky
9000-10000 K         Shade or Heavily Overcast Sky
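The blackbody spectra described above follow Planck's law. A minimal sketch in relative units (physical constants are standard values, not taken from this article; only ratios between wavelengths matter here):

```python
import math

def planck_relative(wavelength_nm, temp_k):
    """Relative spectral radiance of a blackbody (Planck's law,
    arbitrary units): B is proportional to 1/lambda^5 / (exp(hc/(lambda k T)) - 1).
    """
    h = 6.626e-34   # Planck constant (J s)
    c = 2.998e8     # speed of light (m/s)
    k = 1.381e-23   # Boltzmann constant (J/K)
    lam = wavelength_nm * 1e-9
    return lam ** -5 / math.expm1(h * c / (lam * k * temp_k))

# A 3000 K source emits more red (650 nm) than blue (450 nm) light,
# while a 9000 K source does the opposite -- hence "warm" vs "cool" light.
```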
IN PRACTICE: JPEG & TIFF FILES Since some light sources do not resemble blackbody radiators, white balance uses a second variable in addition to color temperature: the green-magenta shift. Adjusting the green-magenta shift is often unnecessary under ordinary daylight, however fluorescent and other artificial lighting may require significant green-magenta adjustments to the WB.

Fortunately, most digital cameras contain a variety of preset white balances, so you do not have to deal with color temperature and green-magenta shift during the critical shot. Common presets include:

• Auto White Balance
• Custom
• Kelvin
• Tungsten
• Fluorescent
• Daylight
• Flash
• Cloudy
• Shade

The first three white balances allow for a range of color temperatures. Auto white balance is available in all digital cameras and uses a best-guess algorithm within a limited range — usually between 3000/4000 K and 7000 K. Custom white balance allows you to take a picture of a known gray reference under the same lighting, and then set that as the white balance for future photos. With "Kelvin" you can set the color temperature over a broad range.

The remaining six white balances are listed in order of increasing color temperature, however many compact cameras do not include a shade white balance. Some cameras also include a "Fluorescent H" setting, which is designed to work with newer daylight-calibrated fluorescent lamps.

The description and symbol for the above white balances are just rough estimates for the actual lighting they work best under. In fact, cloudy could be used in place of daylight depending on the time of day, elevation, or degree of haziness. In general, if your image appears too cool on your LCD screen preview (regardless of the setting), you can quickly increase the color temperature by selecting a setting further down the list above. If the image is still too cool (or too warm, if going the other direction), you can resort to manually entering a temperature in the Kelvin setting.

If all else fails, and the image still does not have the correct WB after inspecting it on a computer afterwards, you can adjust the color balance to remove additional color casts. Alternatively, one could click on a colorless reference (see the section on neutral references) with the "set gray point" dropper while using the "levels" tool in Photoshop. Either of these methods should be avoided when possible, however, since they can severely reduce the bit depth of your image.
IN PRACTICE: THE RAW FILE FORMAT By far the best white balance solution is to photograph using the RAW file format (if your camera supports it), as this allows you to set the WB *after* the photo has been taken. RAW files also allow one to set the WB based on a broader range of color temperatures and green-magenta shifts.

Performing a white balance with a RAW file is quick and easy. You can either adjust the temperature and green-magenta sliders until color casts are removed, or you can simply click on a neutral reference within the image (see next section). Even if only one of your photos contains a neutral reference, you can click on it and then use the resulting WB settings for the remainder of your photos (assuming the same lighting).
CUSTOM WHITE BALANCE: CHOOSING A NEUTRAL REFERENCE A neutral reference is often used for color-critical projects, or for situations where one anticipates auto white balance will encounter problems. Neutral references can either be parts of your scene (if you're
lucky), or can be a portable item which you carry with you. Below is an example of a fortunate reference in an otherwise bluish twilight scene.
On the other hand, pre-made portable references are almost always more accurate, since one can easily be tricked into thinking an object is neutral when it is not. Portable references can be expensive items designed specifically for photography, or less expensive household items. An ideal gray reference is one which reflects all colors in the spectrum equally, and can consistently do so under a broad range of color temperatures. An example of a pre-made gray reference is shown below:
Common household neutral references include the underside of a lid from a coffee or Pringles container. These are both inexpensive and reasonably accurate, although custom-made photographic references are the best (such as the cards shown above). Custom-made devices can be used to measure either the incident or reflected color temperature of the illuminant. Most neutral references measure reflected light, whereas a device such as a white balance meter or an "ExpoDisc" can measure incident light (and can theoretically be more accurate).

Care should be taken when using a neutral reference with high image noise, since clicking on a seemingly gray region may actually select a colorful pixel caused by color noise:
Low Noise (Smooth Colorless Gray)
High Noise (Patches of Color)

If your software supports it, the best solution for white balancing noisy images is to use the average of pixels within a noisy gray region as your reference. This can be either a 3x3 or 5x5 pixel average if using Adobe Photoshop.
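Clicking on a gray reference typically rescales the color channels so that the (averaged) patch becomes neutral. A minimal numpy sketch of this idea — a von Kries-style channel scaling, not any particular editor's actual algorithm; the function and parameter names are my own:

```python
import numpy as np

def white_balance_from_patch(image, row, col, radius=2):
    """White balance an RGB float image from a neutral reference patch.

    Averages a (2*radius+1)^2 patch around (row, col) -- mirroring
    the noisy-region averaging advice above -- then scales the R and
    B channels so the patch matches its green channel.
    """
    patch = image[row - radius:row + radius + 1,
                  col - radius:col + radius + 1].reshape(-1, 3).mean(axis=0)
    gains = patch[1] / patch          # per-channel gains; green is the anchor
    return np.clip(image * gains, 0.0, 1.0)

# Example: a gray scene with a warm color cast is restored to neutral gray.
```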
NOTES ON AUTO WHITE BALANCE Certain subjects create problems for a digital camera's auto white balance — even under normal daylight conditions. One example is if the image already has an overabundance of warmth or coolness due to unique subject matter. The image below illustrates a situation where the subject is predominantly red, and so the camera mistakes this for a color cast induced by a warm light source. The camera then tries to compensate for this so that the average color of the image is closer to neutral, but in doing so it unknowingly creates a bluish color cast on the stones. Some digital cameras are more susceptible to this than others.
Automatic White Balance
Custom White Balance (Custom white balance uses an 18% gray card as a neutral reference.) A digital camera's auto white balance is often more effective when the photo contains at least one white or bright colorless element. Of course, do not try to change your composition to include a colorless object, but just be aware that its absence may cause problems with the auto white balance. Without the white boat in the image below, the camera's auto white balance mistakenly created an image with a slightly warmer color temperature.
IN MIXED LIGHTING Multiple illuminants with different color temperatures can further complicate performing a white balance. Some lighting situations may not even have a truly "correct" white balance, and will depend upon where color accuracy is most important.
(Interactive example: white balance reference set to either the moon or the stone.)
Under mixed lighting, auto white balance usually calculates an average color temperature for the entire scene, and then uses this as the white balance. This approach is usually acceptable; however, auto white balance tends to exaggerate the difference in color temperature for each light source, as compared with what we perceive with our eyes. Exaggerated differences in color temperature are often most apparent with mixed indoor and natural lighting. Critical images may even require a different white balance for each lighting region, although some may prefer to leave the color temperatures as is. Note how the building to the left is quite warm, whereas the sky is somewhat cool. This is because the white balance was set based on the moonlight, bringing out the warm color temperature of the artificial lighting below. White balancing based on the natural light often yields a more realistic photograph. If the stone were instead chosen as the white balance reference, the sky would become unrealistically blue.
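When a critical image needs a different white balance per lighting region, the blending step might look like the following sketch. The mask, gain values and region names are purely illustrative:

```python
import numpy as np

def blend_white_balance(image, mask, gains_a, gains_b):
    """Apply one set of white-balance gains where `mask` is True (e.g., the
    moonlit sky) and another where it is False (e.g., the artificially lit
    buildings). In practice the mask would be feathered to hide the seam."""
    gains = np.where(mask[..., None], gains_a, gains_b)
    return image * gains

img = np.full((2, 2, 3), 0.5)
sky_mask = np.array([[True, True], [False, False]])
daylight = np.array([1.0, 1.0, 1.0])   # leave the moonlit sky as metered
tungsten = np.array([0.8, 1.0, 1.3])   # cool down the warm artificial light
out = blend_white_balance(img, sky_mask, daylight, tungsten)
```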
UNDERSTANDING CAMERA AUTOFOCUS A camera's autofocus system intelligently adjusts the camera lens to obtain focus on the subject, and can mean the difference between a sharp photo and a missed opportunity. Despite the seemingly simple goal of sharpness at the focus point, the inner workings of how a camera focuses are unfortunately not as straightforward. This tutorial aims to improve your photos by introducing how autofocus works, thereby enabling you to both make the most of its assets and avoid its shortcomings. Note: Autofocus (AF) works either by using contrast sensors within the camera (passive AF) or by emitting a signal to illuminate or estimate the distance to the subject (active AF). Passive AF can be performed using either the contrast detection or phase detection method, but both rely on contrast for achieving accurate autofocus; they will therefore be treated as qualitatively similar for the purposes of this tutorial. Unless otherwise stated, this tutorial assumes passive autofocus. The AF assist beam method of active autofocus is discussed towards the end.
CONCEPT: AUTOFOCUS SENSORS A camera's autofocus sensor(s) are the real engine behind achieving accurate focus, and are laid out in various arrays across your image's field of view. Each sensor measures relative focus by assessing changes in contrast at its respective point in the image, where maximal contrast is assumed to correspond to maximal sharpness.
(Interactive diagram: sensor histogram at 400% for varying focus amounts — blurred, partial and sharp.)
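The contrast assessment each sensor performs can be imitated with a simple focus measure. The sketch below sums squared differences between neighboring pixels of a hypothetical grayscale crop; sharper images have stronger local gradients and therefore higher scores:

```python
import numpy as np

def contrast_score(gray):
    """A simple focus measure for contrast-detection AF: the sum of squared
    differences between horizontally and vertically adjacent pixels."""
    dx = np.diff(gray, axis=1)
    dy = np.diff(gray, axis=0)
    return float((dx ** 2).sum() + (dy ** 2).sum())

sharp = np.tile([0.0, 1.0], (8, 4))    # hard edges: high contrast
blurred = np.tile([0.4, 0.6], (8, 4))  # same pattern, lower contrast
# The sharp crop scores higher, so the AF system would prefer this focus setting.
```

A real camera compares such scores across focus settings rather than judging any single value in isolation.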
Please visit the tutorial on image histograms for a background on image contrast. Note: many compact digital cameras use the image sensor itself as a contrast sensor (a method called contrast detection AF), and do not necessarily have multiple discrete autofocus sensors (which are more common with the phase detection method of AF).
Further, the above diagram illustrates the contrast detection method of AF; phase detection is another method, but it too relies on contrast for accurate autofocus. The process of autofocusing generally works as follows:
(1) An autofocus processor (AFP) makes a small change in the focusing distance.
(2) The AFP reads the AF sensor to assess whether and by how much focus has improved.
(3) Using the information from (2), the AFP sets the lens to a new focusing distance.
(4) The AFP may iteratively repeat steps (2)-(3) until satisfactory focus has been achieved.
This entire process is usually completed within a fraction of a second. For difficult subjects, the camera may fail to achieve satisfactory focus and will give up on repeating the above sequence, resulting in failed autofocus. This is the dreaded "focus hunting" scenario where the camera focuses back and forth repeatedly without achieving a focus lock. This does not, however, mean that focus is impossible for the chosen subject. Whether and why autofocus may fail is primarily determined by the factors in the next section.
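The steps above amount to an iterative search for the contrast peak. Here is a minimal sketch; the contrast function and distances are hypothetical stand-ins for real lens movements and sensor readings:

```python
def autofocus(contrast_at, distance, step=0.5, max_iters=50):
    """Hill-climb toward maximum contrast: nudge the focus distance,
    check whether contrast improved, and repeat with a shrinking step.
    `contrast_at` stands in for reading the AF sensor at a focus distance."""
    best = contrast_at(distance)
    for _ in range(max_iters):
        improved = False
        for candidate in (distance + step, distance - step):
            score = contrast_at(candidate)
            if score > best:                       # step (2): did focus improve?
                distance, best = candidate, score  # step (3): move the lens
                improved = True
        if not improved:
            step /= 2          # refine the search around the current peak
            if step < 1e-3:
                break          # satisfactory focus reached
    return distance

# Toy scene: contrast peaks when focused at 3.2 m.
peak = 3.2
found = autofocus(lambda d: -(d - peak) ** 2, distance=1.0)
```

If no step ever improves contrast (a flat, low-contrast subject), the loop exhausts its iterations without converging, which corresponds to the "focus hunting" failure described above.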
FACTORS AFFECTING AUTOFOCUS PERFORMANCE The photographic subject can have an enormous impact on how well your camera autofocuses—and often even more so than any variation between camera models, lenses or focus settings. The three most important factors influencing autofocus are the light level, subject contrast and camera or subject motion.
An example illustrating the quality of different focus points is shown to the left; move your mouse over this image to see the advantages and disadvantages of each focus location. Note that these factors are not independent; in other words, one may be able to achieve autofocus even for a dimly lit subject if that same subject also has extreme contrast, or vice versa. This has an important implication for your choice of autofocus point: selecting a focus point which corresponds to a sharp edge or pronounced texture can achieve better autofocus, all other factors being equal. In the example to the left, we were fortunate that the location where autofocus performs best also corresponds to the subject location. The next example is more problematic because autofocus performs best on the background, not the subject. Move your mouse over the image below to highlight areas of good and poor performance.
In the photo to the right, if one focused on the fast-moving light sources behind the subject, one would risk an out-of-focus subject when the depth of field is shallow (as would be the case for a low-light action shot like this one). Alternatively, focusing on the subject's exterior highlight would perhaps be the best approach, with the caveat that this highlight would change sides and intensity rapidly depending on the location of the moving light sources. If one's camera had difficulty focusing on the exterior highlight, a lower contrast (but stationary and reasonably well lit) focus point would be the subject's foot, or leaves on the ground at the same distance as the subject.
What makes the above choices difficult, however, is that these decisions often have to be either anticipated or made within a fraction of a second. Additional specific techniques for autofocusing on still and moving subjects will be discussed in their respective sections towards the end of this tutorial.
NUMBER & TYPE OF AUTOFOCUS POINTS The robustness and flexibility of autofocus is primarily a result of the number, position and type of autofocus points made available by a given camera model. High-end SLR cameras can have 45 or more autofocus points, whereas other cameras can have as few as one central AF point. Two example layouts of autofocus sensors are shown below:
(Diagrams: AF point layouts for a high-end SLR, whose sensor types vary with maximum lens apertures of f/2.8, f/4.0, f/5.6 and f/8.0, and for an entry to midrange SLR, with maximum apertures of f/2.8, f/4.0 and f/5.6.)
Cameras used for the left and right examples are the Canon 1D MkII and Canon 20D, respectively. For these cameras, autofocus is not possible for apertures smaller than f/8.0 and f/5.6, respectively.
Two types of autofocus sensors are shown:
• cross-type sensors (two-dimensional contrast detection, higher accuracy)
• vertical line sensors (one-dimensional contrast detection, lower accuracy)
Note: the "vertical line sensor" is so named because it detects contrast along a vertical line. Ironically, this type of sensor is therefore best at detecting horizontal lines. For SLR cameras, the number and accuracy of autofocus points can also change depending on the maximum aperture of the lens being used, as illustrated above. This is an important consideration when choosing a camera lens: even if you do not plan on using a lens at its maximum aperture, this aperture may still help the camera achieve better focus accuracy. Further, since the central AF sensor is almost always the most accurate, for off-center subjects it is often best to first use this sensor to achieve a focus lock (before recomposing the frame). Multiple AF points can work together for improved reliability, or can work in isolation for improved specificity, depending on your chosen camera setting. Some cameras also have an "auto depth of field" feature for group photos which ensures that a cluster of focus points are all within an acceptable level of focus.
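The point about line direction can be illustrated numerically. In this sketch, a one-dimensional sensor samples contrast along a single axis only; the stripe pattern is a stand-in for a subject made of horizontal edges, such as stairs:

```python
import numpy as np

def line_sensor_contrast(gray, axis):
    """Contrast seen by a one-dimensional line sensor: summed squared
    differences along one direction only. axis=0 samples along a vertical
    line (and so detects horizontal edges); axis=1 samples along a
    horizontal line (and detects vertical edges)."""
    return float((np.diff(gray, axis=axis) ** 2).sum())

# Horizontal stripes: brightness changes only in the vertical direction.
stripes = np.tile([[0.0], [1.0]], (4, 8))
vertical_line = line_sensor_contrast(stripes, axis=0)    # strong signal
horizontal_line = line_sensor_contrast(stripes, axis=1)  # no signal at all
```

A vertical line sensor locks onto this subject easily, while a horizontal line sensor sees zero contrast, which is also why rotating the camera 90° can rescue a failed focus lock (see the last section).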
AF MODE: CONTINUOUS & AI SERVO vs. ONE SHOT The most widely supported camera focus mode is one-shot focusing, which is best for still subjects. The one-shot mode is susceptible to focus errors for fast-moving subjects since it cannot anticipate subject motion, and it can also make these moving subjects difficult to visualize in the viewfinder. One-shot focusing requires a focus lock before the photograph can be taken. Many cameras also support an autofocus mode which continually adjusts the focus distance for moving subjects. Canon cameras refer to this as "AI Servo" focusing, whereas Nikon cameras refer to it as "continuous" focusing. It works by predicting where the subject will be slightly in the future, based on estimates of the subject's velocity from previous focus distances. The camera then focuses at this predicted distance in advance to account for the shutter lag (the delay between pressing the shutter button and the start of the exposure). This greatly increases the probability of correct focus for moving subjects. Example maximum tracking speeds are shown for various Canon cameras below:
Values are for ideal contrast and lighting, and use the Canon 300mm f/2.8 IS L lens. The above plot should provide a rule-of-thumb estimate for other cameras as well. Actual maximum tracking speeds also depend on how erratically the subject is moving, the subject's contrast and lighting, the type of lens, and the number of autofocus sensors being used to track the subject. Also be warned that using focus tracking can dramatically reduce your camera's battery life, so use it only when necessary.
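The velocity-extrapolation idea behind predictive tracking can be sketched as follows. The readings and shutter lag here are made-up example values, not measurements from any particular camera:

```python
def predict_focus_distance(history, shutter_lag):
    """Estimate subject velocity from the last two (time, distance) focus
    readings and extrapolate to where the subject will be once the shutter
    actually opens, i.e., after the shutter lag."""
    (t0, d0), (t1, d1) = history[-2:]
    velocity = (d1 - d0) / (t1 - t0)   # negative: subject approaching
    return d1 + velocity * shutter_lag

# Subject approaching at 5 m/s, focus readings 0.1 s apart, 50 ms shutter lag:
readings = [(0.0, 10.0), (0.1, 9.5)]
focus_at = predict_focus_distance(readings, shutter_lag=0.05)
```

Focusing at the extrapolated distance rather than the last measured one is what keeps a fast-approaching subject sharp despite the shutter lag; real cameras refine this with more than two readings and with acceleration estimates.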
AUTOFOCUS ASSIST BEAM Many cameras come equipped with an AF assist beam, which is a method of active autofocus that uses a visible or infrared beam to help the autofocus sensors detect the subject. This can be very helpful in situations where your subject is not adequately lit or has insufficient contrast for autofocus, although the AF assist beam also comes with the disadvantage of much slower autofocus. Most compact cameras use a built-in infrared light source for the AF assist, whereas digital SLR cameras often use either a built-in or external camera flash to illuminate the subject. When using a flash for the AF assist, the AF assist beam may have trouble achieving focus lock if the subject moves appreciably between flash firings. Use of the AF assist beam is therefore only recommended for still subjects.
IN PRACTICE: ACTION PHOTOS Autofocus will almost always perform best with action photos when using the AI servo or continuous modes. Focusing performance can be improved dramatically by ensuring that the lens does not have to search over a large range of focus distances.
Perhaps the most universally supported way of achieving this is to pre-focus your camera at a distance near where you anticipate the moving subject will pass. In the biker example to the right, one could pre-focus near the side of the road, since one would expect the biker to pass by at about that distance. Some SLR lenses also have a minimum focus distance switch; setting this to the greatest distance possible (assuming the subject will never be closer) can also improve performance. Be warned, however, that in continuous autofocus mode, shots can still be taken even if a focus lock has not yet been achieved.
IN PRACTICE: PORTRAITS & OTHER STILL PHOTOS Still photos are best taken using the one-shot autofocus mode, which ensures that a focus lock has been achieved before the exposure begins. The usual focus point requirements of contrast and strong lighting still apply, although one needs to ensure there is very little subject motion. For portraits, the eye is the best focus point—both because this is a standard and because it has good contrast. Although the central autofocus sensor is usually most sensitive, the most accurate focusing is achieved using the off-center focus points for off-center subjects. If one were to instead use the central AF point to achieve a focus lock (prior to recomposing for an off-center subject), the focus distance will always be behind the actual subject distance—and this error increases for closer subjects. Accurate focus is especially important for portraits because these typically have a shallow depth of field.
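The back-focus error from focusing and recomposing can be quantified with a little trigonometry. This is a simplified geometric sketch; the distance and angle are example values. After locking focus at distance d and rotating the camera by an angle to recompose, the plane of focus sits at d along the new optical axis, while the subject sits at only d·cos(angle) along that axis, so focus lands behind the subject:

```python
import math

def recompose_focus_error(subject_distance, recompose_angle_deg):
    """Back-focus error after focus-lock-and-recompose: the subject ends up
    in front of the focal plane by d * (1 - cos(angle))."""
    theta = math.radians(recompose_angle_deg)
    return subject_distance * (1 - math.cos(theta))

# Portrait at 1.5 m, recomposing by 15 degrees:
error_m = recompose_focus_error(1.5, 15)   # roughly 5 cm of back-focus
```

Five centimeters can exceed the entire depth of field of a close portrait at a wide aperture, which is why using an off-center AF point directly is the more accurate choice here.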
Since the most common type of AF sensor is the vertical line sensor, it may also be worth considering whether your focus point contains primarily vertical or horizontal contrast. In low-light conditions, one may be able to achieve a focus lock not otherwise possible by rotating the camera 90° during autofocus. In the example to the left, the stairs are composed primarily of horizontal lines. If one were to focus near the back of the foreground stairs (to maximize apparent depth of field using the hyperfocal distance), one could avoid a failed autofocus by first orienting the camera in landscape mode during autofocus. Afterwards, one could rotate the camera back to portrait orientation during the exposure, if so desired. Note that the emphasis in this tutorial has been on *how* to focus, not necessarily *where* to focus. For further reading on this topic, please visit the tutorials on depth of field and the hyperfocal distance.