It’s All a Blur: Deer Creek Fall


On the Rocks, Deer Creek Fall, Grand Canyon
Sony a7RIII
Sony 16-35 f/2.8 GM
Breakthrough 6-stop ND
5 seconds
F/11
ISO 100

Do I have a favorite place in the Grand Canyon? Difficult to say, but I definitely have a shortlist, and Deer Creek Fall is on it. Of course this beautiful waterfall is right on the river and far from a secret, so it’s often overrun by other rafting trips (in general the bottom of the Grand Canyon is wonderfully uncrowded, but people do tend to congregate at certain spots). Having done this trip six times now, I (with the help of my guides) have learned the timing to minimize or eliminate the people at these popular spots.

This year we found one other group enjoying Deer Creek Fall, but it wasn’t long before they pushed off and we had it to ourselves. In addition to many nice views at river level, there are some great scenes above the fall. The trail to the slot canyon that feeds the fall is steep, with a few spots that require a little non-technical climbing to get to the next level, but the payoff makes the effort worth it. The view of the river and Grand Canyon is (cliché alert) breathtaking, and from there a short trek through a beautiful slot canyon opens to an emerald oasis called “The Patio.” I’ve made the hike to the Patio once, but was kind of unnerved by a 20-foot stretch of 2-foot wide trail carved into a vertical wall and vowed not to do it again (give me something to hold onto and I could stand on top of Mt. Everest, but Alex Honnold I’m not).

Despite the threat of rain, I joined a handful of hikers in my group who followed a couple of our guides through the creek and up the trail. My plan was to stick with them to the view right before the slot canyon entrance, but after stopping briefly to photograph this scene, I climbed up nearby rocks to chase the group and immediately found more photo-worthy scenes overlooking the fall. The cloud cover created such wonderful light, I decided to forego the hike in favor of new photo opportunities.

After 30 minutes photographing Deer Creek Fall from a series of elevated ledges, I scrambled back down to my original river-level scene. I’d rushed it earlier and wanted more quality time here. I worked it for another half hour before moving to other views of the fall. Here I used a 5-second exposure that blurred the water in the fall and nearby cascade, and also captured a small swirl of foam near the rocks.

Full disclosure: My shutter speed options were limited by the fact that I’d departed for this trip thinking that my 82mm polarizer was in its normal place, affixed to my Sony 16-35 GM lens, but it turned out that what I thought was a polarizer was actually my Breakthrough 6-stop neutral density polarizing filter—the polarizer was in a pocket back home. Oops. So to get the polarization I wanted, I had no choice but to use the ND filter, which prevented me from capturing anything but extreme motion blur.
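For anyone who likes the arithmetic, here's a rough sketch (my own illustration, using the numbers from this image) of how much a 6-stop ND stretches the shutter speed:

```python
# A quick sketch of the filter math (illustrative numbers, not a field calculation):
# each stop of neutral density doubles the exposure time needed for the same brightness.
nd_stops = 6
nd_factor = 2 ** nd_stops                 # 64x longer exposure with a 6-stop ND
shutter_with_nd = 5.0                     # the 5-second exposure used here
shutter_without_nd = shutter_with_nd / nd_factor

print(f"6-stop ND factor: {nd_factor}x")
print(f"5 s with the ND is roughly 1/{round(1 / shutter_without_nd)} s without it")  # ~1/13 s
```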

Workshop Schedule || Purchase Prints


Here’s my just-updated Photo Tips article on photographing motion


Motion


Fern Cascade, Russian Gulch Fall, Russian Gulch State Park (Mendocino), California

True story

I once had a photographer tell me that he didn’t like blurred water images because they’re “not natural.” The conversation continued something like this:

Me: “So how would you photograph that waterfall?”

Misguided Photographer: “I’d use a fast shutter speed to freeze the water.”

Me: “And you think that’s more natural than blurred water?”

Misguided Photographer: “Of course.”

Me: “And how many times have you seen water droplets frozen in midair?”

Misguided Photographer: “Uhhh….”

Photographic reality

The truth is, “natural” is a target that shifts with perspective. Humans experience the world as a 360-degree, three-dimensional, multi-sensory reel that unfolds in an infinite series of connected instants that our brain seamlessly processes as quickly as it comes in. But the camera discards 80 percent of the sensory input, limits the view to a rectangular box, and compresses those connected instants into a single, static frame. In other words, it’s impossible for a camera to duplicate human reality—the sooner photographers get that, the sooner they can get to work on expressing the world using their camera’s very different but quite compelling reality.

Despite the creative opportunities in their hands (or on their tripod), many photographers expend a great deal of effort trying to force their cameras closer to human reality (HDR, focus blending, and so on)—not inherently wrong, but in so doing they miss opportunities to reveal overlooked aspects of our complex natural world. Subtracting the distractions from the non-visual senses, controlling depth of focus, and banishing unwanted elements to the world outside the frame, a camera can distill a scene to its overlooked essentials, offering perspectives that are impossible in person.

Motion

A still image can’t display actual motion, but it can convey the illusion of motion that, among other things, frees the viewer’s imagination and establishes the scene’s mood. While nothing like our experience of the world, a camera can freeze the extreme chaos of a single instant, or combine a series of instants into a blur that conveys a pattern of motion.

Combining creative vision and technical skill, a photographer chooses where an image will fall on the continuum that connects these extremes of motion: the sudden drama of a crashing wave, or the soothing calm of soft surf; the explosive power of a plunging river, or the silky curves of tumbling cascades. Or perhaps someplace in the midrange of the motion continuum, stopping the action enough that discrete elements stand out, but not so much that a sense of flow is lost.

Blurred water

One question I’m quite frequently asked is, “How do I blur water?” And while there’s no magic formula, no shutter speed threshold beyond which all water blurs, blurring water isn’t that hard (as long as you use a tripod). In fact, when you photograph in the full shade or cloudy sky conditions I prefer, it’s usually more difficult to freeze moving water than it is to blur it (which is why I have very few images of water drops suspended in midair).

In addition to freezing motion or revealing a pattern of motion, an often overlooked opportunity is the smoothing effect a long exposure has on choppy water. I photograph at a lot of locations known for their reflections, but sometimes I arrive to find a wind has stirred the water into a disorganized, reflection-thwarting frenzy. In these situations a long exposure can smooth the chop, allowing the reflection to come through. Rather than the mirror reflection I came for, I get an ethereal, gauzy effect that still captures the reflection’s color and shape.

The amount of water motion blur you get depends on several variables:

  • The water’s speed—the faster the water, and (especially) the more whitewater (green water, no matter how fast it’s moving, doesn’t usually display a lot of motion blur), the greater the blur
  • Your focal length—the longer the focal length, the greater the blur
  • Your distance from the water—the closer the water, the greater the blur
  • And of course, the shutter speed—the longer your shutter is open, the greater the blur

Of these variables, it’s shutter speed that gets the most attention. That’s because focal length and subject distance are compositional considerations, and we usually don’t start thinking about blurring the water until after we have our composition. (This is as it should be—when composition doesn’t trump motion, the result is often a gimmicky image without much soul.)

You have several tools at your disposal for reducing the light reaching your sensor (and thereby lengthening your shutter speed), each with its advantages and disadvantages:

  • Don’t even think about any kind of subject blur without a sturdy tripod. For help selecting the right tripod, read the Tripod Selection article in my Photo Tips section.
  • Reducing ISO: Since you’re probably already at your camera’s native ISO (usually 100), this option often isn’t available. Some cameras allow you to expand the ISO below the native value, usually down to ISO 50. That extra stop of shutter duration you gain comes with a (very) slight decrease in image quality—most obvious to me as about 1/3 stop of dynamic range lost.
  • Shrinking your aperture (larger f-stop value): A smaller aperture buys you more depth of field, but it also increases diffraction. Also, lenses tend to be less sharp at their most extreme apertures, so as a general rule, I resist going with an aperture smaller than f/11 unless it’s necessary. That said, I often find myself shooting at f/16 (and only very rarely smaller), but it’s always a conscious choice after eliminating all other options for reducing light (or it’s a mistake, something I’m not immune to).
  • Adding a polarizing filter: In addition to reducing reflections, a polarizer will subtract 1 to 2 stops of light (depending on its orientation). When using a polarizer you need to be vigilant about orienting it each time you recompose (especially if you change your camera’s horizontal/vertical orientation), and monitoring its effect on the rest of your scene.
  • Adding a neutral density filter: A neutral density filter is, as its name implies, both neutral and dense. Neutral in that it doesn’t alter the color of your image; dense in that it cuts the amount of light reaching your sensor. While a dark enough ND filter might allow you to blur water on even the brightest of days, it does nothing for the other problems inherent to midday, full sunlight shooting. ND filters come in variable and fixed-stop versions—the flexibility of variable NDs (the ability to dial the amount of light up and down) means living with the vignetting they add to my wide angle images.

Before Sunrise, South Tufa, Mono Lake
Here a 3-second exposure smoothed a wind-induced chop and restored the reflection.

Because blurring water depends so much on the amount of light reaching your sensor, I can’t emphasize too much the importance of actually understanding metering and exposure, and how to manage the zero-sum relationship between shutter speed, aperture (f-stop), and ISO.
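If the zero-sum idea is new to you, a small sketch may help (my own illustration, not specific to any particular camera): any change to one exposure variable, measured in stops, has to be offset by an equal and opposite change to the others.

```python
import math

# Illustrative sketch of the zero-sum exposure relationship, measured in stops
# (+1 stop doubles the light reaching the sensor, -1 stop halves it).
def aperture_stops(n1, n2):
    """Stops of light gained by opening the aperture from f/n1 to f/n2."""
    return 2 * math.log2(n1 / n2)

def iso_stops(iso1, iso2):
    """Stops of sensitivity gained by raising the ISO from iso1 to iso2."""
    return math.log2(iso2 / iso1)

# Hypothetical example: start at 1/60 second, f/8, ISO 400, then drop to ISO 100
# (2 stops less sensitivity) and stop down to f/11 (about 1 stop less light).
# The shutter must slow by the same ~3 stops to keep the exposure constant.
stops_lost = iso_stops(400, 100) + aperture_stops(8, 11)   # about -2.9 stops
new_shutter = (1 / 60) * 2 ** -stops_lost
print(f"Compensating shutter speed: ~{new_shutter:.2f} seconds")   # ~0.13 s, about 1/8 second
```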

Read my Exposure basics Photo Tips article

Bracketing for motion

Back in the film days, we used to bracket (multiple clicks of the same scene with minor adjustments) for exposure. But in today’s world of improved dynamic range and pre- and post-capture histograms, exposure bracketing is (or at least should be) limited to photographers who blend multiple exposures. Today I only bracket for scene changes that will give me a variety of images to choose between later.

Often my scene bracketing is for depth of field, as I run a series of clicks with a range of f-stops, then decide later whether I want a little or a lot of DOF. But my most frequent use of scene bracketing is to capture a variety of water motion effects. I start by finding a composition I like, then adjust my shutter speed (compensating for the exposure change with ISO and/or f-stop changes) to get different motion blur.

River and stream whitewater is usually (but not always) fairly constant, so my adjustments are usually just to vary the amount of motion blur. But when I’m photographing waves, the timing of the waves is as important as the motion blur. It helps to stand back and observe the waves for a while to get a sense for any patterns. Watching the direction of the waves and the size of the approaching swells not only allows me to time my exposures more efficiently, it also keeps me safe (and dry).

Star motion

Few images validate the power of the camera’s unique vision better than a scene etched with the parallel arcs of rotating stars (yes, I know it’s not actually the stars that are rotating). Nothing like human reality, the camera’s view of the night sky is equal parts beautiful and revealing. (Can you think of a faster, more effective way to demonstrate Earth’s rotation than a star trail image?)

Here are the factors that determine the amount of stellar motion:

  • Exposure duration: The longer your shutter is open, the more motion your sensor captures.
  • Focal length: Just as it is with terrestrial subjects, a longer focal length shrinks the range of view and magnifies the stars that remain.
  • Direction of composition: Compositions aimed toward the North or South Poles will display less star motion than compositions aimed toward the celestial equator. That’s because, due to Earth’s rotation on its axis (an imaginary, infinite line skewering our North and South Poles), everything in the sky appears to rotate 360 degrees around the poles in 24 hours. But even though every star rotates the same number of degrees in any given period (seconds, minutes, hours, or whatever), the stars travel different distances in that same period: The farther a star is from the axis of rotation (the North or South Pole), the more visual distance it covers to complete its circuit (it appears to move faster). This is easy to see when you realize that the farther a star is from one of the poles, the longer its arc is in a star trail image (see the rough numbers just after this list).
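To put rough numbers on that last point, here's a back-of-the-envelope sketch (my own, with assumed values for focal length, exposure, sensor, and star position) of how far a star's image travels across the frame during an exposure:

```python
import math

# Back-of-the-envelope star-trail length (all values are assumptions for illustration).
focal_mm = 24              # lens focal length
exposure_s = 30            # shutter speed in seconds
declination_deg = 0        # 0 = celestial equator (fastest apparent motion), 90 = pole (none)
sensor_width_mm = 36       # full frame sensor width
sensor_width_px = 8000     # roughly a modern high-resolution full frame body

sidereal_day_s = 86164                               # one 360-degree rotation of the sky
sky_rotation_deg = 360 * exposure_s / sidereal_day_s # degrees the sky turns during the exposure
trail_mm = focal_mm * math.radians(sky_rotation_deg) * math.cos(math.radians(declination_deg))
trail_px = trail_mm / (sensor_width_mm / sensor_width_px)

print(f"Sky rotation during a {exposure_s} s exposure: {sky_rotation_deg:.3f} degrees")
print(f"Approximate trail length on the sensor: {trail_mm:.3f} mm (~{trail_px:.0f} pixels)")
```

With these assumed numbers, a star on the celestial equator smears across roughly a dozen pixels in 30 seconds, while a star near one of the poles barely moves at all, which is why compositions aimed toward the poles tolerate longer exposures.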

Star Trails, Desert View, Grand Canyon National Park

As with water motion, you can choose between a long exposure that exaggerates stellar motion, or a shorter exposure that freezes the stars in place to display a more conventional night sky (albeit with more stars than our eyes can discern).

Read more in my Starlight photography Photo Tips article

Freezing motion

The other end of the motion continuum is stopping it in its tracks with an exposure of extremely short duration. Sometimes the goal is simply to avoid blurring something that should be stationary, like flowers or leaves. But just as a long exposure can blur water to reveal patterns in its motion that aren’t visible to the unaided eye, a short exposure that freezes a fast-moving or ephemeral subject can reveal detail that happens too fast for the unaided eye to register.

Stopping motion in an image often requires exposure compromises, such as a larger than ideal aperture or ISO, or removing a polarizer. In my landscape world, f-stop rules all, so I won’t compromise my f-stop unless it’s truly irrelevant—for example, when everything in the scene is at infinity at all f-stops. And I’m reluctant to remove a polarizer because its effect, even when small, can’t be duplicated in Photoshop. Fortunately, compromising ISO is relatively painless given today’s digital cameras’ stellar high ISO capabilities.

Wind-blown leaves, breaking surf, and plummeting waterfalls are all examples of detail that can be frozen in the act, but my favorite example of an instant frozen in time is a lightning strike. Lightning comes and goes so fast that the human experience of it is always just a memory—it’s gone before we register its existence. Read how to photograph lightning in my Lightning article in the Photo Tips section of my blog.


So what’s the point?

In the static world of a photograph, it’s up to the photographer to create a sense of motion. Sometimes we achieve this with lines that lead the eyes through the scene, but even more powerful is an image that uses motion to tap its viewers’ imagination. Whether it’s freezing an instant, or connecting a series of instants in a single frame, the way you handle motion in your scene is a creative choice that’s enabled by your creative vision and technical skill.

Workshop Schedule || Purchase Prints


A gallery of motion

Click an image for a closer look, and a slide show. Refresh the screen to reorder the display.

 

The Evolution of a Stargazer


Dark Sky, Milky Way Above the Colorado River, Grand Canyon
Sony a7SII
Sony 24 f/1.4 GM
20 seconds
F/1.4
ISO 6400

In the Beginning

I grew up in a camping family. My dad was a minister, so pricey airline/hotel/restaurant vacations were out of the question for the five of us, as of course were weekend camping trips. But for as far back as I can remember, each summer my family went camping somewhere. Usually it was a week or two in Yosemite, Sequoia/Kings Canyon, the California coast, or some other relatively close scenic destination, but every few years we’d hook up the tent trailer, pile into the station wagon, and take a road trip.

The one constant in this numbing succession of summer campsites was the dark sky far from city lights, and the vast sprinkle of stars that mesmerized me. I soon learned that stargazing is the one thing a child can do for as long as he wants after bedtime without getting in trouble. I enjoyed playing connect-the-dots with the stars, identifying named constellations, or making up my own. It turned out all this scanning was a great way to catch shooting stars, and soon my goal was to stay awake until one flashed across my vision. And satellites were still something of a novelty back then, so another camping bedtime exercise was to slowly scan the sky looking for a “star” that moved; when I found one, I’d track it across the sky until it disappeared behind the horizon—or my eyelids.

At some point I became aware of a hazy band of light stretching across my night sky. On the darkest nights, when my vantage point faced the right direction, the widest and brightest part of this band reminded me of sugar spilled on pooled ink. But the Milky Way wasn’t as dramatic as some of the other stuff in my night skies, so the childhood Me was oblivious to its inherent coolness for many years.

On these nightly scans I was more interested in the apparent randomness in the patterns overhead—the consistency of certain stellar arrangements, and the few bright “stars” that appeared in different positions each night relative to these recognizable patterns. Someone explained to me the difference between stars and planets, that stars were far and planets were close, and that was good enough for me. For a while.

Then, when I was about ten, my best friend and I did a science project on comets, which ignited a sudden and intense interest in all things astronomical. I was gifted a second-hand telescope by a friend of my dad, which we’d set up in my best friend’s front yard on summer nights. Through the telescope the stars remained (boring) points of light, no matter how much I magnified them, but the planets became fascinating disks, each with its own personality. I learned that Venus and Mercury were actually crescents of varying size, just like a mini moon. After searching in vain for the canals on Mars, I was thrilled to (barely) see Saturn’s rings, and to watch the nightly dance of the four pin-prick Galilean moons.

All this stargazing helped me develop a rudimentary understanding of celestial relationships, the vastness of space, the sun’s dominant role in our solar system, and its utter insignificance in the Universe. And the more I learned about astronomy, the more fascinating our home galaxy became. Rather than something I just passively observed, the Milky Way became a catalyst for pondering the mysteries of the Universe, and my favorite night sky feature.

Fast forward…

Then came college, marriage, family, jobs, cameras (lots of cameras) until I found myself at the bottom of the Grand Canyon on this moonless night in May. It was the second night of my annual Grand Canyon Raft Trip for Photographers, a highlight in a year full of highlights, and my first opportunity each year to reconnect with my favorite celestial feature. After night one hadn’t worked out, I told myself that we still had four more chances, but at bedtime on night two I was a little more pessimistic.

The prescription for a successful Milky Way photograph includes a clear view of the southern sky with a nice foreground. There’s no shortage of foreground in the Grand Canyon, but southern sky views are not quite so plentiful. The first night had been spectacularly clear, but our otherwise spectacular campsite was on an east/west trending section of river (I try to select each campsite for its astrophotography potential, but the sites can’t be reserved, and sometimes there are other factors to consider), which placed the rising galactic core behind a towering canyon wall. On our second day we’d scored prime real estate on a north/south section of river a few miles upstream from Desert View, but now thin clouds threatened to spoil the show.

In May the Milky Way doesn’t usually crest the canyon walls until 2:00 or 3:00 a.m. (depending on the location), but as we prepared for bed that second day, only a handful of stars smoldered in the gauzy veil above. But with six hours for conditions to improve, I prepared anyway, identifying my foreground, setting up my tripod next to my cot, and mounting my Sony a7SII body and Sony 24mm f/1.4 lens with ISO, f-stop, and shutter speed set.

Waking a little before 3:00, I instantly saw far more stars than had been visible at bedtime. But more importantly, there was the Milky Way, directly overhead. I sat up and peered toward the river—the soft glow of several LCD screens told me others were already shooting, so I grabbed my tripod and stumbled down to the river’s edge in the dark (to avoid illuminating the others’ scene). It’s quite amazing how well you can see by the light of the Milky Way once your eyes adjust.

After a few frames I saw that a few thin clouds remained, creating interesting patterns against the starry background. By about 4 a.m., an hour-and-a-half before sunrise, loss of contrast in my images that wasn’t visible to my eyes told me the approaching sun was already starting to brighten the sky. I photographed for about an hour that morning, then managed to catch another 45 minutes of contented sleep before the guides’ coffee call got me up for good.

Workshop Schedule || Purchase Prints


I continue updating my Photo Tips articles—here’s my just-updated Milky Way article,

with all you need to know to locate and photograph our home galaxy


How to photograph the Milky Way


See the Milky Way

Look heavenward on a moonless (Northern Hemisphere) summer night far from city light. The first thing to strike you is the sheer volume of stars, but as your eyes adjust, your gaze is drawn to a luminous band spanning the sky. Ranging from magnificently brilliant to faintly visible, this is the Milky Way, home to our sun and nearly a half trillion other stars of varying age, size, and temperature.

Size and shape

Though every star you’ve ever seen is part of our Milky Way galaxy, stargazers use the Milky Way label more specifically to identify this river of starlight, gas, and dust spanning the night sky. As you feast your eyes, appreciate that some of the Milky Way’s starlight has traveled 25,000 years to reach your eyes, and light from a star on one edge of the Milky Way would take 100,000 years to reach the other side.


Milky Way look-alike spiral galaxy: This is what our galaxy would look like from the outside, looking in. (The individual stars visible here are “local” and not part of the spiral galaxy depicted here.) Earth would be between two of the spiral arms, about halfway out from the center.

The rest of the sky appears to be filled with far more discrete stars than the region containing the Milky Way, but don’t let this deceive you. Imagine that you’re out in the countryside where the lights of a distant city blend into a homogeneous glow—similarly, the stars in the Milky Way’s luminous band are simply too numerous and distant to resolve individually. On the other hand, the individual pinpoints of starlight that we name and mentally assemble into constellations are just closer, much like the lights of nearby farmhouses. And the dark patches in the Milky Way aren’t empty space—like the trees and mountains that block our view of the city, they’re starlight-blocking interstellar dust and gas, remnants of exploded stars and the stuff of future stars.

Just as it’s impossible to know what your house looks like by peering out a window, it’s impossible to know what the Milky Way looks like by simply looking up on a dark night. Fortunately for us, really smart people have been able to infer from painstaking observation, measurement, reconstruction, and comparison with other galaxies that our Milky Way is flat (much wider than it is tall) and spiral shaped, like a glowing pinwheel, with two major arms and several minor arms spiraling out from its center. Our solar system is in one of the Milky Way’s minor arms, a little past midway between the center and outer edge.

Blinded by the light

Sadly, artificial light and atmospheric pollution have erased the view of the Milky Way for nearly a third of the world’s population, and eighty percent of Americans. Worse still, even though some part of the Milky Way is overhead on every clear night, many people have never seen it.

Advances in digital technology have spurred a night photography renaissance that has enabled the Milky Way challenged to enjoy images of its splendor from the comfort of their recliner, but there’s nothing quite like viewing it in person. With just a little knowledge and effort, you too can enjoy the Milky Way firsthand; add the right equipment and a little more knowledge, and you’ll be able to photograph it as well.

Horizon to Horizon

Understanding that our Solar System is inside the Milky Way’s disk makes it easier to understand why we can see some portion of the Milky Way on any night (assuming the sky is dark enough). In fact, from our perspective, the plane of the Milky Way forms a complete ring around Earth (but of course we can only see half the sky at any given time), with its brightness varying depending on whether we’re looking toward our galaxy’s dense center or sparse outer region.

Where the action is

Milky Way and Halemaʻumaʻu Crater, Kilauea, Hawaii

The Milky Way’s brilliant center, its “galactic core,” radiates above Kilauea on Hawaii’s Big Island

Though the plane of the Milky Way stretches all the way across our sky, when photographers talk about photographing the Milky Way, they usually mean the galactic core—the Milky Way’s center and most densely packed, brightest region. Unfortunately, our night sky doesn’t always face the galactic core, and there are many months when this bright region is not visible at all.

To understand the Milky Way’s visibility in our night sky, it helps to remember that Earth both rotates on its axis (a day), and revolves around the sun (a year). When the side of the planet we’re on rotates away from the sun each day, the night sky we see is determined by our position on our annual trip around the sun—when Earth is between the sun and the galactic core, we’re in position to see the most brilliant part of the Milky Way; in the months when the sun is between Earth and the galactic core, the bright part of the Milky Way can’t be seen.

Put in terrestrial terms, imagine you’re at the neighborhood playground, riding a merry-go-round beneath a towering oak tree. You face outward, with your back to the merry-go-round’s center post. As the merry-go-round spins, your view changes—about half of the time you face the oak’s trunk, and about half the time your back is to it. Our solar system is like that merry-go-round: the center post is the sun, the Milky Way is the tree, and in the year it takes our celestial merry-go-round to make a complete circle, we’ll face the Milky Way about half the time.

Finding the Milky Way

Just like every other celestial object outside our solar system, the Milky Way’s position in our sky changes with the season and time of night you view it, but it remains constant relative to the other stars and constellations. This means you can find the Milky Way by simply locating any of the constellations in the galactic plane. Here’s an alphabetical list of the constellations* through which the Milky Way passes (with brief notes by a few of the more notable constellations):

  • Aquila
  • Ara
  • Auriga—faintest
  • Canis Major—faint
  • Carina
  • Cassiopeia—faint; its easily recognized “w” (or “m”) shape makes Cassiopeia a good landmark for locating the Milky Way in the northern sky
  • Cepheus
  • Circinus
  • Crux
  • Cygnus—bright
  • Gemini
  • Lacerta
  • Lupus
  • Monoceros
  • Musca
  • Norma
  • Ophiuchus
  • Orion—faint; another easy to recognize constellation that’s good for finding the galactic plane
  • Perseus—faint
  • Puppis
  • Pyxis
  • Sagitta
  • Sagittarius—brightest, galactic core
  • Scorpius—bright
  • Scutum
  • Serpens
  • Taurus—faint
  • Triangulum
  • Vela
  • Vulpecula
* Constellations are made up of stars that only appear connected by virtue of our Earth-bound perspective—a constellation is a direction in the sky, not a location in space.

If you can find any of these constellations, you’re looking in the direction of some part of the Milky Way (if you can’t see it, your sky isn’t dark enough). But most of us want to see the center of the Milky Way, where it’s brightest, most expansive, and most photogenic. The two most important things to understand about finding the Milky Way’s brilliant center are:

  • From our perspective here on Earth, the galactic core is in Sagittarius (and a couple of other constellations near Sagittarius)—when Sagittarius is visible, so is the brightest part of the Milky Way (assuming you can find a dark enough sky)
  • Earth’s night side most directly faces Sagittarius in the Northern Hemisphere’s summer months (plus part of spring and autumn)

Armed with this knowledge, locating the Milky Way’s core is as simple as opening one of my (too many) star apps to find out where Sagittarius is. Problem solved. Of course it helps to know that the months when the galactic core rises highest and is visible longest are June, July, and August, and to not even consider looking before mid-March, or after mid-October. If you can’t wait until summer and don’t mind missing a little sleep, starting in April, Northern Hemisphere residents with a dark enough sky can catch Sagittarius and the galactic core rising in the southeast shortly before sunrise. After its annual premiere in April, the Milky Way’s core rises slightly earlier each night and is eventually well above the horizon by nightfall.
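The "rises slightly earlier each night" pace is easy to quantify: the stars keep sidereal time, and a sidereal day is a few minutes shorter than a solar day. Here's a rough back-of-the-envelope sketch of my own to show the pace:

```python
# Why the galactic core rises earlier each night: the stars keep sidereal time,
# and a sidereal day is slightly shorter than a 24-hour solar day.
solar_day_min = 24 * 60                      # 1440 minutes
sidereal_day_min = 23 * 60 + 56 + 4 / 60     # about 1436.1 minutes
shift_per_night = solar_day_min - sidereal_day_min

print(f"~{shift_per_night:.1f} minutes earlier per night")          # ~3.9 minutes
print(f"~{shift_per_night * 30 / 60:.0f} hours earlier per month")  # ~2 hours
```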

People who enjoy sleep prefer doing their Milky Way hunting in late summer and early autumn, when the galactic core has been above the horizon for most of the daylight hours, but remains high in the southwest sky as soon as the post-sunset sky darkens enough for the stars to appear. The farther into summer and autumn you get, the closer to setting beneath the western horizon the Milky Way will be at sunset, and the less time you’ll have before it disappears.

Into the darkness

The Milky Way is dim enough to be easily washed out by light pollution and moonlight, so the darker your sky, the more visible the Milky Way will be. To ensure sufficient darkness, I target moonless hours, from an hour or so after sunset to an hour before sunrise. New moon nights are easiest because the new moon rises and sets (more or less) with the sun and there’s no moon all night. But on any night, if you pick a time before the moon rises, or after it sets, you should be fine. Be aware that the closer the moon is to full, the greater the potential for its glow to leak into the scene from below the horizon.

Getting away from city lights can be surprisingly difficult (and frustrating). Taking a drive out into the countryside near home is better than nothing, but while it may seem dark enough to your eyes, a night exposure in an area you expected to be dark often reveals just how insidious light pollution is: images washed out by an unnatural glow on the horizon. Since the galactic core is in the southern sky in the Northern Hemisphere, you can mitigate urban glow in your Milky Way images by heading south of any nearby population area, putting the glow behind you as you face the Milky Way.

Better than a night drive out to the country, plan a trip to a location with a truly dark sky. For this, those in the less densely populated western US have an advantage. The best resource for finding world-class dark skies anywhere on Earth is the International Dark-Sky Association. More than just a resource, the IDA actively advocates for dark skies, so if the quality of our night skies matters to you, spend some time on their site, get involved, and share their website with others.

Photograph the Milky Way


Viewing the Milky Way requires nothing more than a clear, dark sky. (Assuming clean, clear skies) the Milky Way’s luminosity is fixed, so our ability to see it is largely a function of the darkness of the surrounding sky—the darker the sky, the better the Milky Way stands out. But because our eyes can only take in a fixed amount of light, there’s a ceiling on our ability to view the Milky Way with the unaided eye.

A camera, on the other hand, can accumulate light for a virtually unlimited duration. This, combined with technological advances that continue increasing the light sensitivity of digital sensors, means that when it comes to photographing the Milky Way, well…, the sky’s the limit. As glorious as it is to view the Milky Way with the unaided eye, a camera will show you detail and color your eyes can’t see.

Knowing when and where to view the Milky Way is a great start, but photographing the Milky Way requires a combination of equipment, skill, and experience that doesn’t just happen overnight (so to speak). But Milky Way photography doesn’t need to break the bank, and it’s not rocket science.

Equipment

Bottom line, photographing the Milky Way is all about maximizing your ability to collect light: long exposures, fast lenses, high ISO.

Camera

In general, the larger your camera’s sensor and photosites (the “pixels” that capture the light), the more efficiently it collects light. Because other technology is involved, there’s not an absolute correlation between sensor and pixel size and light gathering capability, but a small, densely packed sensor almost certainly rules out your smartphone and point-and-shoot cameras for anything more than a fuzzy snap of the Milky Way. At the very least you’ll want a mirrorless or DSLR camera with an APS-C (1.5/1.6 crop) size sensor. Better still is a full frame mirrorless or DSLR camera. (A 4/3 Olympus or Panasonic sensor might work, but as great as these cameras are for some things, high ISO photography isn’t their strength.)

Another general rule is that the newer the technology, the better it will perform in low light. Even with their smaller, more densely packed sensors, many of today’s top APS-C bodies deliver better low-light performance than full frame bodies that have been out for a few years, so full frame or APS-C, if your camera is relatively new, it will probably do the job.

If you’re shopping for a new camera and think night photography might be in your future, compare your potential cameras’ high ISO capabilities—not their maximum ISO. Read reviews by credible sources like DP Review, Imaging Resource, or DxOMark (among many others) to see how your camera candidates fare in objective tests.

An often overlooked consideration is the camera’s ability to focus in extreme low light. Autofocusing on the stars or landscape will be difficult to impossible, and you won’t be able to see well enough through a DSLR’s viewfinder to manually focus. Some bodies with a fast lens might autofocus on a bright star or planet, but it’s not something I’d count on (though I expect this capability to become more common within a few years).

Having photographed for years with Sony and Canon, and working extensively with most other mirrorless and DSLR bodies in my workshops, I have lots of experience with cameras from many manufacturers. In my book, focus peaking makes mirrorless the clear winner for night focusing. Sony’s current mirrorless bodies (a7RII/RIII, a7S/SII) are by far the easiest I’ve ever used for focusing in the dark—what took a minute or more with my Canon, I can do in seconds using focus peaking with my Sony bodies (especially the S bodies). I use the Sony a7SII, but when I don’t want to travel with a body I only use for night photography, the Sony a7RIII does the job too. Of the major DSLR brands, I’ve found Canon’s superior LCD screen (as of 2019) makes it much easier to focus in extreme low light than Nikon. (More on focus later.)

Lens

Put simply, to photograph the Milky Way you want fast, wide glass—the faster the better. Fast to capture as much light as possible; wide to take in lots of sky. A faster lens also makes focus and composition easier because the larger aperture gathers more light. How fast? F/2.8 or faster—preferably faster. How wide? At least 28mm, and wider is better still. I do enough night photography that I have a dedicated, night-only lens—my original night lens was a Canon-mount Zeiss 28mm f/2; my current night lens is the Sony 24mm f/1.4.

Tripod

It goes without saying that at exposure times up to 30 seconds, you’ll need a sturdy tripod and head for Milky Way photography. You don’t need to spend a fortune, but the more you spend, the happier you’ll be in the long run (trust me). Carbon fiber provides the best combination of strength, vibration reduction, and light weight, but a sturdy (albeit heavy) aluminum tripod will do the job.

An extended centerpost is not terribly stable, and a non-extended centerpost limits your ability to spread the tripod’s legs and get low, so I avoid tripods with a centerpost. But if you have a sturdy tripod with a centerpost, don’t run out and purchase a new one—just don’t extend the centerpost when photographing at night.

Read my tips for purchasing a tripod here.

Other stuff

To eliminate the possibility of camera vibration I recommend a remote release; without a remote you’ll risk annoying all within earshot with your camera’s 2-second timer beep. You’ll want a flashlight or headlamp for the walk to and from the car, and your cell phone for light while shooting. And it’s never a bad idea to toss an extra battery in your pocket. And speaking of lights, never, never, NEVER use a red light for night photography (more on this later).

Getting the shot

Keep it simple

There are just so many things that can go wrong on a moonless night when there’s not enough light to see camera controls, the contents of your bag, or the tripod leg you’re about to trip over. After doing this for many years, both on my own and helping others in workshops, I’ve decided that simplicity is essential.

Simplicity starts with paring down to the absolute minimum camera gear: a sturdy tripod, one body, one lens, and a remote release (plus an extra battery in my pocket). Everything else stays at home, in the car, or if I’m staying out after a sunset shoot, in my bag.

Upon arrival at my night photography destination, I extract my tripod, camera, lens (don’t forget to remove the polarizer), and remote release. I connect the remote and mount my lens—if it’s a zoom I set the focal length at the lens’s widest—then set my exposure and focus (more on exposure and focus below). If I’m walking to my photo site, I carry the pre-exposed and focused camera on the tripod (I know this makes some people uncomfortable, but if you don’t trust your tripod head enough to hold onto your camera while you’re walking, it’s time for a new head), trying to keep the tripod as upright and stable as possible as I walk.

Flashlights/headlamps are essential for the walk to and from my shooting location, but while I’m there and in shoot mode, it’s no flashlights, no exceptions. This is particularly important when I’m with a group. Not only does a flashlight inhibit your night vision, its light leaks into the frame of everyone who’s there. And while red lights may be better for your night vision and are great for telescope viewing, red light is especially insidious about leaking into everyone’s frame, so if you plan to take pictures, no red light! If you follow my no flashlight rule once the photography begins, you’ll be amazed at how well your eyes adjust. I can operate my camera’s controls in the dark—it’s not hard with a little practice, and well worth the effort to learn. If I ever do need to see my camera to adjust something, or if I need to see to move around, my cell phone screen (not the phone’s flashlight, just its illuminated screen) gives me all the light I need.

Composition

A good Milky Way image is distinguished from an ordinary Milky Way image by its foreground. Simply finding a location that’s dark enough to see the Milky Way is difficult enough; finding a dark location that also has a foreground worthy of pairing with the Milky Way usually takes a little planning.

Since the Milky Way’s center is in the southern sky (for Northern Hemisphere observers), I look for remote (away from light pollution) subjects that I can photograph while facing south (or southeast or southwest, depending on the month and time of night). Keep in mind that unless you have a ridiculous light gathering camera (like the Sony a7S or a7S II) and an extremely fast lens (f/2 or faster), your foreground will probably be more dark shape than detail. Water’s inherent reflectivity makes it a good foreground subject as well, especially if the water includes rocks or whitewater.

When I encounter a scene I deem photo worthy, not only do I try to determine its best light and moon rise/set possibilities, I also consider its potential as a Milky Way subject. Can I align it with the southern sky? Are there strong subjects that stand out against the sky? Is there water I can include in my frame?

I’ve found views of the Grand Canyon from the North Rim, the Kilauea Caldera, and the bristlecone pines in California’s White Mountains that work spectacularly. And it’s hard to beat the dark skies and breathtaking foreground possibilities at the bottom of the Grand Canyon. On the other hand, while Yosemite Valley has lots to love, you don’t see a lot of Milky Way images from there: not only is there a lot of light pollution, but Yosemite’s towering, east/west trending granite walls also give its south views an extremely high horizon that blocks much of the galactic core from the valley floor.

The last few years I’ve started photographing the Milky Way above the spectacular winter scenery of New Zealand’s South Island, where the skies are dark and the Milky Way is higher in the sky than it is in most of North America.

To maximize the amount of Milky Way in my frame, I generally (but not always) start with a vertical orientation that’s at least 2/3 sky. On the other hand, I do make sure to give myself more options with a few horizontal compositions as well. Given the near total darkness required of a Milky Way shoot, it’s often too dark to see well enough to compose the scene. If I can’t see well enough to compose, I guess at a composition, take a short test exposure at an extreme (unusable) ISO to enable a relatively fast shutter speed (a few seconds), adjust the composition based on the image in the LCD, and repeat until I’m satisfied.
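If numbers help, here's the arithmetic behind that test-exposure trick (a sketch with illustrative values, not a prescription): every stop of extra ISO buys a correspondingly shorter test shutter speed.

```python
# Composition-check trick: trade ISO for shutter speed, stop for stop.
# Values here are illustrative only.
final_iso, final_shutter_s = 6400, 20     # the exposure you actually intend to use
test_iso = 102400                         # extreme, unusable ISO just for checking framing

test_shutter_s = final_shutter_s * final_iso / test_iso
print(f"Test exposure: ISO {test_iso} at {test_shutter_s:.2f} s")   # 1.25 s instead of 20 s
```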

Focus

Needless to say, when it’s dark enough to view the Milky Way, there’s not enough light to autofocus (unless you have a rare camera/lens combo that can autofocus on a bright star or planet), or even to manually focus with confidence. And of all the things that can ruin a Milky Way image (not to mention an entire night), poor focus is number one. Not only is achieving focus difficult, it’s very easy to think you’re focused only to discover later that you just missed.

Because the Milky Way’s focus point is infinity, and you almost certainly won’t have enough light to stop down for more depth of field, your closest foreground subjects should be far enough away to be sharp when you’re wide open and focused at infinity. Before going out to shoot, find a hyperfocal app and plug in the values for your camera and lens at its widest aperture. Even though it’s technically possible to be sharp from half the hyperfocal distance to infinity, the precise focus that focusing on the hyperfocal point requires is difficult to impossible in the dark, so my rule of thumb is to make sure my closest subject is no closer than the hyperfocal distance.

For example, I know with my Sony 24mm f/1.4 wide open on my full frame Sony a7SII, the hyperfocal distance is about 50 feet. If I have a subject that’s closer (such as a bristlecone pine), I’ll pre-focus (before dark) on the hyperfocal distance, or shine a bright light on an object at the hyperfocal distance and focus there, but generally I make sure everything is at least 50 feet away. Read more about hyperfocal focus in my Depth of Field article.
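If you'd rather sanity-check the app's number yourself, the standard hyperfocal formula is simple enough to compute; this sketch (which assumes a 0.03 mm circle of confusion, a common full frame value) lands in the same ballpark as the roughly 50-foot figure above:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f
# f = focal length (mm), N = f-number, c = circle of confusion (mm).
def hyperfocal_mm(focal_mm, f_number, coc_mm=0.03):
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

h = hyperfocal_mm(24, 1.4)
print(f"24mm at f/1.4 on full frame: ~{h / 1000:.1f} m (~{h / 304.8:.0f} ft)")   # ~13.7 m, ~45 ft
```

Plug in 16mm at f/16 with the same assumed circle of confusion and the hyperfocal distance drops to roughly half a meter, which is how near-to-far sharpness with a foreground only a foot or two away becomes possible.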

By far the number one cause of night focus misses is the idea that you can just dial any lens to infinity, followed closely by the idea that focused at one focal length means focused at all focal lengths. When it comes to sharpness, almost isn’t good enough: if you have a zoom lens, don’t even think of trying to dial the focus ring to the end for infinity. And even for most prime lenses, the infinity point is a little short of all the way to the end, and can vary slightly with the temperature and f-stop. Of course if you know your lens well enough to be certain of its infinity point by feel (and are a risk taker), go for it. And that zoom lens that claims to be parfocal? While it’s possible that your zoom will hold focus throughout its entire focal range, regardless of what the manufacturer claims, I wouldn’t bet an entire shoot on it without testing first.

All this means that the only way to ensure night photography sharpness is to focus carefully on something before shooting, refocus every time your focal length changes, and check focus frequently by displaying and magnifying an image on your LCD. To simplify (there’s that word again), when using a zoom lens, I usually set the lens at its widest focal length, focus, verify sharpness, and (once I know I’m focused) never change the focal length again.

While the best way to ensure focus is to set your focal length and focus before it gets dark, sometimes pre-focusing isn’t possible, or for some reason you need to refocus after darkness falls. If I arrive at my destination in the dark, I autofocus on my headlights, a bright flashlight, or a laser 50 feet or more away. And again, never assume you’re sharp by looking at the image that pops up on the LCD when the exposure completes—always magnify your image and check it after you focus.

For more on focusing in the dark, including how to use stars to focus, read my Starlight Photo Tips article.

Exposure

Exposing a Milky Way image is wonderfully simple once you realize that you don’t have to meter—because you can’t (not enough light). Your goal is simply to capture as many photons as you can without damaging the image with noise, star motion, and lens flaws.

Basically, with today’s technology you can’t give a Milky Way image too much light—you’ll run into image quality problems before you overexpose a Milky Way image. In other words, capturing the amount of light required to overexpose a Milky Way image is only possible if you’ve chosen an ISO and/or shutter speed that significantly compromises the quality of the image with excessive noise and/or star motion.

In a perfect world, I’d take every image at ISO 100 and f/8—the best ISO and f-stop for my camera and lens. But that’s not possible when photographing in near total darkness—a usable Milky Way image requires exposure compromises. What kind of compromises? The key to getting a properly exposed Milky Way image is knowing how far you can push your camera’s exposure settings before the light gained isn’t worth the diminished quality. Each exposure variable causes a different problem when pushed too far:

  • ISO: Raising ISO to increase light sensitivity comes with a corresponding increase in noise that muddies detail. The noise at any particular ISO varies greatly with the camera, so it’s essential to know your camera’s low-light capability(!). Some of the noise can be cleaned up with noise reduction software (I use Topaz DeNoise 6)—the amount that cleans up will depend on the noise reduction software you use, your skill using that software, and where the noise is (is it marring empty voids or spoiling essential detail?).
  • Shutter speed: The longer the shutter stays open, the more motion blur spreads the stars’ distinct pinpoints into streaks. I’m not a big fan of formulas that dictate star photography shutter speeds because I find them arbitrary and inflexible, and they fail to account for the fact that the amount of apparent stellar motion varies with the direction you’re composing (you’ll get less motion the closer to the north or south poles you’re aimed). My general shutter-speed rule of thumb is 30 seconds or less, preferably less—I won’t exceed 30 seconds, and do everything I can to get enough light with a faster shutter speed.
  • F-stop: At their widest apertures, lenses tend to lose sharpness (especially on the edges) and display optical flaws like comatic aberration (also called coma) that distort points of light (like stars) into comet-shaped blurs. For many lenses, stopping down even one stop from wide open significantly improves image quality.

Again: My approach to metering for the Milky Way is to give my scene as much light as I can without pushing the exposure compromises to a point I can’t live with. Where exactly is that point? Not only does that question require a subjective answer that varies with each camera body, lens, and scene, but as technology improves, I’m also less forgiving of exposure compromises than I once was. For example, when I started photographing the Milky Way with my Canon 1DS Mark III, the Milky Way scenes I could shoot were limited because my fastest wide lens was f/4 and I got too much noise when I pushed my ISO beyond 1600. This forced me to compromise by shooting wide open with a 30-second shutter speed to achieve even marginal results. In fact, given these limitations, despite trying to photograph the Milky Way from many locations, when I started the only Milky Way foreground that worked well enough was Kilauea Caldera, because it was its own light source (an erupting volcano).

Today (mid-2019) I photograph the Milky Way with a Sony a7S II and a Sony 24mm f/1.4 lens. I get much cleaner images from my Sony at ISO 6400 than I got at ISO 1600 on my Canon 1DS III, and the light gathering capability of an f/1.4 lens is revelatory. At ISO 6400 (or higher) I can stop down slightly to eliminate lens aberrations (though I don’t seem to need to with the Sony lens), drop my shutter speed to 20 or 15 seconds to reduce star motion 33-50 percent, and still get usable foreground detail by starlight.

I can’t emphasize enough how important it is to know your camera’s and lens’s capabilities in low light, and how far you’re comfortable pushing the ISO and f-stop. For each of the night photography equipment combos I’ve used, I’ve established a general exposure upper threshold: rule-of-thumb compromise points for each exposure setting that I won’t exceed until I’ve reached the compromise threshold of the other exposure settings. For example, with my Sony a7SII/24mm f/1.4 combo, I usually start at ISO 6400, f/1.4, and 20 seconds. Those settings will usually get me enough light for Milky Way color and pretty good foreground detail. But if I want more light (for example, if I’m shooting into the black pit of the Grand Canyon from the canyon rim), my first exposure compromise might be to increase to ISO 12800; if I decide I need even more light, my next compromise is to bump my shutter speed to 30 seconds. Or if I want a wider field of view than 24mm, I’ll put on my Sony 16-35 f/2.8 GM lens and increase to ISO 12800 and 30 seconds.

These thresholds are guidelines rather than hard-and-fast rules, and they apply to my preferences only—your results may vary. And though I’m pretty secure with this workflow, for each Milky Way composition I try a variety of exposure combinations before moving to another composition. Not only does this give me a range of options to choose between when I’m at home and reviewing my images on a big monitor, it also gives me more insight into my camera/lens capabilities, allowing me to refine my exposure compromise threshold points.

One other option that I’ve started applying automatically is long exposure noise reduction, which delivers a noticeable reduction in noise for exposures that are several seconds and longer.

* In normal situations the Sony a7SII can handle ISO 12,800 without even breathing hard, but the long exposure time required of night photography generates a lot of heat on the sensor with a corresponding increase in noise.

It’s time to click that shutter

You’re in position with the right gear, composed, focused, and exposure values set. Before you actually click the shutter, let me remind you of a couple of things you can do to ensure the best results: First, lower that center post. A tripod center post’s inherent instability is magnified during long exposures, not just by wind, but even by nearby footsteps, the press of the shutter button, and slap of the mirror (and sometimes it seems, by ghosts). And speaking of shutter clicks, you should be using a remote cable or two-second timer to eliminate the vibration imparted when your finger presses the shutter button.

When that first Milky Way image pops up on the LCD, it’s pretty exciting. So exciting, in fact, that sometimes you risk being lulled into a “Wow, this isn’t as hard as I expected” complacency. Even though you think everything’s perfect, don’t forget to review your image sharpness every few frames by displaying and magnifying an image on your LCD. In theory nothing should change unless you changed it, but in practice I’ve noticed an occasional inclination for focus to shift mysteriously between shots. Whether it’s slight temperature changes or an inadvertent nudge of the focus ring as you fumble with controls in the dark, periodically checking your sharpness falls under “an ounce of prevention….” Believe me, this will save a lot of angst later.

And finally, don’t forget to play with different exposure settings for each composition. Not only does this give you more options, it also gives you more insight into your camera/lens combo’s low light capabilities.

The bottom line

Though having top-of-the-line, low-light equipment helps a lot, it’s not essential. If you have a full frame mirrorless or DSLR camera that’s less than five years old, and a lens that’s f/2.8 or faster, you probably have all the equipment you need to get great Milky Way images. Even with a cropped sensor, or an f/4 lens, you have a good chance of getting usable Milky Way images in the right circumstances. If you’ve never photographed the Milky Way before, don’t expect perfection the first time out. What you can expect is improvement each time you go out, as you learn the limitations of your equipment and identify your own exposure compromise thresholds. And success or failure, at the very least you’ll have spent a magnificent night under the stars.

Workshop Schedule || Purchase Prints


A Milky Way Gallery

Click an image for a closer look and slide show. Refresh the window to reorder the display.

Finding Focus in the Grand Canyon


Sky Reflection, Blacktail Canyon, Grand Canyon

I returned Friday from my annual Grand Canyon Raft Trip for Photographers and am playing catch-up on all aspects of my photography life. I’ve barely looked at my raft trip images, but chose this one for a couple of reasons: first, because I think it perfectly conveys the intimate serenity that always catches me by surprise in this landscape known mostly for its broad vistas; and second, because it’s the only image I’ve processed so far.

This is Blacktail Canyon, one of hundreds (thousands?) of narrow slot canyons cutting into the Grand Canyon’s towering walls. Most of them we just float past, sometimes because of the physical challenges required to explore their depths, but usually because there just isn’t time to stop at every slot canyon. On my trips we pick our slots for their photo opportunities, and this year Blacktail Canyon was a particular highlight.

With tall, tightly spaced walls, Blacktail Canyon spends most of its daylight hours in full shade, ideal for photography on sunny days. It doesn’t always have water, but this year’s wet winter meant water in lots of places that don’t always get it. We found the little creek that splits the canyon carrying just enough water to create a series of reflective pools before disappearing into the stream bed, only to reappear farther downstream.

What first drew my eye to this scene was a tiny sapling sprouting from an overhanging ledge, but I soon realized that the tree would best serve me as a visual element to hold the top of my frame rather than the primary subject. The most interesting thing, I decided, was the blue sky reflection like a jewel embedded in the creek bed.

To create this composition, I dropped my tripod to about a foot above the canyon floor and positioned myself so the lines connecting my primary focal points (the sky reflection, the pair of boulders, and the green tree) created a triangle. Fitting all this into the frame required a vertical orientation of my Sony a7RIII, at virtually the widest end of my Sony 16-35 f/2.8 GM lens. Even at this wide focal length, the smooth pebbles at my feet were only about a foot away; getting both the nearby pebbles and the glowing (from bounced sunlight) sandstone above the tree sharp meant choosing my exposure settings and focus point very carefully. My hyperfocal app told me that at f/16, by focusing two feet away, I could achieve my sharpness goal. Watching the rapidly changing sky, I timed my click for the best blend of clouds and sky filling the reflection.


To better understand focus technique, below is an updated version of my Depth of Field article from my Photo Tips section


Finding Focus

What’s the point?

It seems like one of photography’s great mysteries is achieving proper focus: the camera settings, where to place the focus point, even the definition of sharpness are all sources of confusion. If you’re a tourist just grabbing snapshots, everything in your frame is likely at infinity and you can just put your camera in full auto mode and click away. But if you’re a photographic artist trying to capture something unique with your mirrorless or DSLR camera and doing your best to have important visual elements at different distances throughout your frame, you need to stop letting your camera decide your focus point and exposure settings.

Of course the first creative focus decision is whether you even want the entire frame sharp. While some of my favorite images use selective focus to emphasize one element and blur the rest of the scene, most (but not all) of what I’ll say here is about using hyperfocal techniques to maximize depth of field (DOF). I cover creative selective focus in much greater detail in another Photo Tip article: Creative Selective Focus.

Beware the “expert”

I’m afraid that there’s some bad, albeit well-intended, advice out there that yields just enough success to deceive people into thinking they’ve got focus nailed, a misperception that often doesn’t manifest until an important shot is lost. I’m referring to the myth that you should focus 1/3 of the way into the scene, or 1/3 of the way into the frame (two very different things, each with its own set of problems).

For beginners, or photographers whose scene doesn’t include subjects from near to far, the 1/3 technique may be a useful rule of thumb. But taking the 1/3 approach to focus requires that you understand DOF and the art of focusing well enough to know when 1/3 won’t work, and how to adjust your focus point and settings. And once you achieve that level of understanding, you may as well do it the right way from the start. Focus control becomes especially important in those scenes where missing the focus point by just a few feet or even inches can make or break an image.

Where to focus this? Of course 1/3 of the way into a scene that stretches for miles won’t work. And 1/3 of the way into a frame with a diagonal foreground won’t work either.

Back to the basics

Understanding a few basic focus truths will help you make focus decisions:

  • A lens’s aperture is the opening that allows light to reach your sensor—the bigger this opening, the more light gets in, but also the smaller your DOF.
  • Aperture is measured in f-stops, which is the lens’s focal length divided by the aperture’s diameter; the higher the f-number, the smaller the aperture and the greater the DOF. So f/8 is actually a bigger aperture (with less DOF) than f/11. This understanding becomes second nature, but if you’re just learning, it’s helpful to think of f-stops this way: the higher the f-number, the greater the depth of field (see the short sketch after this list). Though they’re not exactly the same thing, photographers usually use f-stop and aperture interchangeably.
  • Regardless of its current f-stop setting, a DSLR camera maximizes the light in its viewfinder by always showing you the scene at the lens’s widest aperture. All this extra light makes it easier to compose and focus, but unless your exposure is set for the widest aperture (which it shouldn’t be unless you have a very specific reason to limit your depth of field or maximize light), the image you capture will have more DOF than you see in the viewfinder. The consequence is that you usually can’t see how much of your scene is in focus when you compose. Most cameras have a DOF preview button that temporarily closes the lens down to the f-stop you have set—this shows the scene at its actual DOF, but also darkens the viewfinder considerably (depending on how small your aperture is), making it far more difficult to see the scene.
  • For any focus point, there’s only one (infinitely thin) plane of perfect sharpness, regardless of the focal length and f-stop—everything in front of and behind the plane containing your focus point (and parallel to the sensor) will be some degree of less than maximum sharpness. As long as the zone of less than perfect sharpness isn’t perceptible, it’s considered “acceptably sharp.” When that zone becomes visible, that portion of the image is officially “soft.” Acceptable sharpness varies with the display size and viewing distance.
  • The zone of acceptable sharpness extends a greater distance beyond the focus point than it does in front of the focus point. If you focus on that rock ten feet in front of you, rocks three feet in front of you may be out of focus, but a tree fifty feet away could be sharp. I’ll explain more about this later.
  • While shorter focal lengths may appear to provide more depth of field, believe it or not, DOF doesn’t actually change with focal length. What does change is the size of everything in the image, so as your focal length increases, your apparent DOF decreases. So you really aren’t gaining more absolute DOF with a shorter focal length, the softness just won’t be as visible. When photographers talk about DOF, they’re virtually always talking about apparent DOF—the way the image looks. (That’s the DOF definition I use here too.)
  • The closer your focus point, the narrower your DOF (range of front-to-back sharpness). If you focus your 24mm lens on a butterfly sunning on a poppy six inches from your lens, your DOF is so narrow that it’s possible parts of the poppy will be out of focus; if you focus the same lens on a tree 100 feet away, the mountains behind the tree are sharp too.
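To make the f-stop arithmetic in the second bullet concrete, here’s a minimal sketch in Python that simply divides focal length by f-number to get the physical size of the opening (the 50mm lens and the specific f-numbers are just illustrative choices, not anything from a particular camera):

    # Minimal sketch of the f-number arithmetic: f-number = focal length / aperture diameter,
    # so a higher f-number means a physically smaller opening (and more depth of field).
    def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
        return focal_length_mm / f_number

    for f_number in (2.8, 8, 11, 16):
        diameter = aperture_diameter_mm(50, f_number)  # a 50mm lens, purely for illustration
        print(f"50mm at f/{f_number}: opening is about {diameter:.1f}mm")

Run it and you’ll see that f/8 on a 50mm lens is roughly a 6mm opening while f/11 is closer to 4.5mm, which is why f/8 admits more light but delivers less depth of field.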
Whitney Arch Moonset, Alabama Hills, California

Moonset, Mt. Whitney and Whitney Arch, Alabama Hills, California
With subjects throughout my frame, from close foreground to distant background, it’s impossible to get everything perfectly sharp. Here in the Alabama Hills near Lone Pine, California, I stopped down to f/16 and focused on the most distant part of the arch. This ensured that all of the arch would be perfectly sharp, while keeping Mt. Whitney and the rest of the background “sharp enough.”

Defining sharpness

Depth of field discussions are complicated by the fact that “sharp” is a moving target that varies with display size and viewing distance. But it’s safe to say that all things equal, the larger your ultimate output and closer the intended viewing distance, the more detail your original capture should contain.

To capture detail a lens focuses light on the sensor’s photosites. Remember using a magnifying glass to focus sunlight and ignite a leaf when you were a kid? The smaller (more concentrated) the point of sunlight, the sooner the smoke appeared. In a camera, the finer (smaller) a lens focuses light on each photosite, the more detail the image will contain at that location. So when we focus we’re trying to make the light striking each photosite as concentrated as possible.

In photography we call that small circle of light your lens makes for each photosite its “circle of confusion.” The larger the CoC, the less concentrated the light and the more blurred the image will appear. Of course if the CoC is too small to be seen as soft, either because the print is too small or the viewer is too far away, it really doesn’t matter. In other words, areas of an image with a large CoC (relatively soft) can still appear sharp if small enough or viewed from far enough away. That’s why sharpness can never be an absolute term, and we talk instead about acceptable sharpness that’s based on print size and viewing distance. It’s actually possible for the same image to be sharp for one use, but too soft for another.

So how much detail do you need? The threshold for acceptable sharpness is pretty low for an image that just ends up on an iPhone or an 8×10 calendar on the kitchen wall, but if you want that image to fill the wall above the sofa, acceptable sharpness requires much more detail. And as your print size increases (and/or viewing distance decreases), the CoC that delivers acceptable sharpness shrinks correspondingly.

Many factors determine a camera’s ability to record detail. Sensor resolution of course—the more resolution your sensor has, the more important it becomes to have a lens that can take advantage of that extra resolution. And the more detail you want to capture with that high resolution sensor and tack-sharp lens, the more important your depth of field and focus point decisions become.

Hyperfocal focus

The foundation of a sound approach to maximizing sharpness for a given viewing distance and image size is hyperfocal focusing, an approach that uses viewing distance, f-stop, focal length, and focus point to ensure acceptable sharpness.

The hyperfocal point is the focus point that provides the maximum depth of field for a given combination of sensor size, f-stop, and focal length. Another way to express it is that the hyperfocal point is the closest you can focus and still be acceptably sharp to infinity. When focused at the hyperfocal point, your scene will be acceptably sharp from halfway between your lens and focus point all the way to infinity. For example, if the hyperfocal point for your sensor (full frame, APS-C, 4/3, or whatever), focal length, and f-stop combination is twelve feet away, focusing there will give you acceptable sharpness from six feet (half of twelve) to infinity—focusing closer will soften the distant scene; focusing farther will keep you sharp to infinity but extend the area of foreground softness.

Because the hyperfocal variable (sensor size, focal length, f-stop) combinations are too numerous to memorize, we usually refer to an external aid. That used to mean awkward printed tables with long columns and rows displayed in microscopic print; the more precise the data, the smaller the print. Fortunately, those have been replaced by smartphone apps that deliver more precise information in a much more accessible and readable form. We plug in all the variables and out pops the hyperfocal distance and other useful information.

It usually goes something like this:

  1. Identify the composition
  2. Determine the closest thing that must be sharp (right now I’m assuming you want sharpness to infinity)
  3. Dig the smartphone from one of the 10,000 pockets it could be in
  4. Open the hyperfocal app and plug in the sensor size (usually previously set by you as the default), focal length, and f-stop
  5. Up pops the hyperfocal distance (and usually other info of varying value)
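If you’re curious what the app is doing in step 5, here’s a minimal sketch of the standard hyperfocal formula (generic textbook math, not any particular app’s code): the hyperfocal distance is the focal length squared, divided by the f-number times the circle of confusion, plus the focal length.

    # Minimal sketch of the standard hyperfocal formula: H = f^2 / (N * c) + f,
    # with focal length f in mm, f-number N, and circle of confusion c in mm.
    # 0.03mm is a common full-frame default; shrink it for big prints
    # (see "You're not as sharp as you think" below).
    MM_PER_FOOT = 304.8

    def hyperfocal_feet(focal_length_mm: float, f_number: float, coc_mm: float = 0.03) -> float:
        h_mm = focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm
        return h_mm / MM_PER_FOOT

    h = hyperfocal_feet(24, 11)  # 24mm at f/11 on a full frame sensor
    print(f"Hyperfocal distance: {h:.1f} feet")        # just under 6 feet
    print(f"Sharp from {h / 2:.1f} feet to infinity")  # half the hyperfocal distance

Plug in 24mm and f/11 and you get a hyperfocal distance of a little under six feet, with acceptable sharpness from about three feet to infinity, which is consistent with the Valley View example near the end of this article.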

You’re not as sharp as you think

Since people’s eyes start to glaze over when CoC comes up, they tend to use the default returned by the smartphone app. But just because the app tells you you’ve nailed focus, don’t assume that your work is done. An often overlooked aspect of hyperfocal focusing is that the app makes assumptions that aren’t necessarily right, and in fact are probably wrong.

The CoC your app uses to determine acceptable sharpness is a function of sensor size, display size, and viewing distance. But most apps’ hyperfocal tables assume that you’re creating an 8×10 print that will be viewed from a foot away—maybe valid 40 years ago, but not in this day of mega-prints. The result is a CoC three times larger than the eye’s ability to resolve.

That doesn’t invalidate hyperfocal focusing, but if you use published hyperfocal data from an app or table, your images’ DOF might not be as ideal as you think it is for your use. If you can’t specify a smaller CoC in your app, I suggest that you stop down a stop or so more than the app/table indicates. On the other hand, stopping down to increase sharpness is an exercise in diminishing returns, because diffraction increases as the aperture shrinks and will eventually soften the entire image—I try not to go more than a stop smaller than my data suggests.

Keeping it simple

As helpful as a hyperfocal app can be, whipping out a smartphone for instant in-the-field access to data is not really conducive to the creative process. I’m a big advocate of keeping photography as simple as possible, so while I’m a hyperfocal focus advocate in spirit, I don’t usually use hyperfocal data in the field. Instead I apply hyperfocal principles in the field whenever I think the margin of error gives me sufficient wiggle room.

Though I don’t often use the specific hyperfocal data in the field, I find it helps a lot to refer to hyperfocal tables when I’m sitting around with nothing to do. So if I find myself standing in line at the DMV, or sitting in a theater waiting for a movie (I’m a great date), I open my iPhone hyperfocal app and plug in random values just to get a sense of the DOF for a given f-stop and focal length combination. I may not remember the exact numbers later, but enough of the information sinks in that I accumulate a general sense of the hyperfocal DOF/camera-setting relationships.

Finally, something to do

Unless I think I have very little DOF margin for error in my composition, I rarely open my hyperfocal app in the field. Instead, once my composition is worked out, I determine the closest object I want sharp—the closest object with visual interest (shape, color, texture), regardless of whether it’s a primary subject. From there:

  • If I want to be sharp to infinity and my closest foreground object (that needs to be sharp) is close enough to hit by tossing my hat, I need a fair amount of DOF. If my focal length is pretty wide, I might skip the hyperfocal app, stop down to f/16, and focus a little behind my foreground object. But if I’m at a fairly long focal length, or my closest object is within arm’s reach, I have very little margin for error and will almost certainly refer to my hyperfocal app.
  • If I could hit my foreground object with a baseball and my focal length is 50mm (or so) or less, I’ll probably go with f/11 and just focus on my foreground object. But as my focal length increases, so does the likelihood that I’ll need to refer to my hyperfocal app.
  • If it would take a gun to reach my closest object (picture a distant peak), I choose an f-stop between f/8 and f/11 and focus anywhere in the distance.

Of course these distances are very subjective and will vary with your focal length and composition (not to mention the strength of your pitching arm), but you get the idea. If you find yourself in a small margin for error focus situation without a hyperfocal app (or you just don’t want to take the time to use one), the single most important thing to remember is to focus behind your closest subject. Because you always have sharpness in front of your focus point, focusing on the closest subject gives you unnecessary near sharpness at the expense of distant sharpness. By focusing a little behind your closest subject, you’re increasing the depth of your distant sharpness while (if you’re careful) keeping your foreground subject within the zone of sharpness in front of the focus point.

And finally, foreground softness, no matter how slight, is almost always a greater distraction than slight background softness. So, if it’s impossible to get all of your frame sharp, it’s usually best to ensure that the foreground is sharp.

Some examples

Sunset Palette, Half Dome from Sentinel Dome, Yosemite

A hat’s toss away: The closest pool was about 6 feet from my lens. I stopped down to f/20 (smaller than I generally like to go) and focused on the back of the pool on the left, about 10 feet away.

A baseball throw away: The little clump of wildflowers (lower right) was about 35 feet away and the trees started another 35 feet beyond that. With a focal length of 55mm, I dialed to f/11 and focused on the most distant foreground tree, getting everything from the flowers to Half Dome sharp.

Gary Hart Photography: Tree and Crescent, Sierra Foothills, California

Honey, fetch my rifle: With everything here at infinity I knew I could focus on the trees or moon, confident that the entire frame would be sharp. In this case I opted for f/8 to minimize diffraction while staying in my lens’s sharpest f-stop range, and focused on the tree.

Why not just automatically set the aperture to f/22 and be done with it? I thought you’d never ask. Without delving too far into the physics of light and optics, let’s just say that there’s a not so little light-bending problem called “diffraction” that robs your images of sharpness as your aperture shrinks—the smaller the aperture, the greater the diffraction. Then why not choose f/2.8 when everything’s at infinity? Because lenses tend to lose sharpness at their aperture extremes, and are generally sharper in their mid-range f-stops. So while diffraction and lens softness don’t sway me from choosing the f-stop that gives the DOF I want, I try to never choose an aperture bigger or smaller than I need.

Now that we’ve let the composition determine our f-stop, it’s (finally) time to actually choose the focus point. Believe it or not, with this foundation of understanding we just established, focus becomes pretty simple. Whenever possible, I try to have elements throughout my frame, often starting near my feet and extending far into the distance. When that’s the case I stop down and focus on an object slightly behind my closest subject (the more distant my closest subject, the farther behind it I can focus).

When I’m not sure, or if I don’t think I can get the entire scene sharp, I err on the side of closer focus to ensure that the foreground is sharp. Sometimes before shooting I check my DOF with the DOF preview button, allowing time for my eye to adjust to the limited light. And when maximum DOF is essential and I know my margin for error is small, I don’t hesitate to refer to the DOF app on my iPhone.

A great thing about digital capture is the instant validation of the LCD—when I’m not sure, or when getting it perfect is absolutely essential, after capture I pop my image up on the LCD, magnify it to maximum, check the point or points that must be sharp, and adjust if necessary. Using this immediate feedback to make instant corrections really speeds the learning process.

Sometimes less is more

The depth of field you choose is your creative choice, and no law says you must maximize it. Use your camera’s limited depth of field to minimize or eliminate distractions, create a blur of background color, or simply to guide your viewer’s eye. Focusing on a near subject while letting the background go soft clearly communicates the primary subject while retaining enough background detail to establish context. And an extremely narrow depth of field can turn distant flowers or sky into a colorful canvas for your subject.

In this image of a dogwood blossom in the rain, I positioned my camera to align Bridalveil Fall with the dogwood and used an extension tube to focus extremely close. The narrow depth of field caused by focusing so close turned Bridalveil Fall into a background blur (I used f/18 to keep the fall a little more recognizable), allowing viewers to feast their eyes on the dogwood’s and raindrop’s exquisite detail.
An extension tube on a macro lens at f/2.8 gave me depth of field measured in fractions of an inch. The gold color in the background is more poppies, but they’re far enough away that they blur into nothing but color. The extremely narrow depth of field also eliminated weeds and rocks that would have otherwise been a distraction.

There’s no substitute for experience

No two photographers do everything exactly alike. Determining the DOF a composition requires, the f-stop and focal length that achieves the desired DOF, and where to place the point of maximum focus, are all part of the creative process that should never be left up to the camera. The sooner you grasp the underlying principles of DOF and focus, the sooner you’ll feel comfortable taking control and conveying your own unique vision.

About this image

Gary Hart Photography: Floating Leaves, Valley View, Yosemite

Floating Autumn Leaves, Valley View, Yosemite

Yosemite may not be New England, but it can still put on a pretty good fall color display. A few years ago I arrived  at Valley View on the west side of Yosemite Valley just about the time the fall color was peaking. I found the Merced River filled with reflections of El Capitan and Cathedral Rocks, framed by an accumulation of recently fallen leaves still rich with vivid fall color.

To emphasize the colorful foreground, I dropped my tripod low and framed up a vertical composition. I knew my hyperfocal distance at 24mm and f/11 would be 5 or 6 feet, but with the scene ranging from the closest leaves at about 3 feet away out to El Capitan at infinity, I also knew I’d need to be careful with my focus choices. For a little more margin for error I stopped down to f/16, then focused on the nearest rocks which were a little less than 6 feet away. As I usually do when I don’t have a lot of focus wiggle room, I magnified the resulting image on my LCD and moved the view from the foreground to the background to verify front-to-back sharpness.

Workshop Schedule || Purchase Prints


Playing with Depth: A Gallery of Focus

Click an image for a closer look and slide show. Refresh the screen to reorder the display.

Chasing Rainbows

Gary Hart Photography: Heaven Sent, Grand Canyon Rainbow

Heaven Sent, Grand Canyon Rainbow

The annual Grand Canyon monsoon is known for its spectacular electrical storms, but let’s not forget the rainbows that often punctuate these storms. A rainbow requires rain, sunlight, and the right viewing angle—given the ephemeral nature of a monsoon thunderstorm, it’s usually safe to assume that the sun probably isn’t far behind. To experience a rainbow after a Grand Canyon monsoon storm, all it takes is some basic knowledge, a little faith, and some good fortune.

To help with the knowledge part, I’m sharing the how-and-why of rainbows, excerpted from my just updated Rainbow article in my Photo Tips section. For the faith and good fortune part, read “The story of this image” at the bottom of this post.

Rainbows Demystified

Most people understand that a rainbow is light spread into various colors by airborne water drops. Though a rainbow can seem like a random, unpredictable phenomenon, the natural laws governing rainbows are actually quite specific and predictable, and understanding these laws can help photographers anticipate a rainbow and enhance its capture.

Let there be light

Energy generated by the sun bathes Earth in continuous electromagnetic radiation, its wavelengths ranging from extremely short to extremely long (and every wavelength in between). Among the broad spectrum of electromagnetic solar energy we receive are ultra-violet rays that burn our skin, infrared waves that warm our atmosphere, and a very narrow range of wavelengths the human eye sees.

These visible wavelengths are captured by our eyes and interpreted by our brain. When our eyes take in light comprised of the full range of visible wavelengths, we perceive it as white (colorless) light. Color registers when some wavelengths are more prevalent than others. For example, when light strikes an opaque (solid) object such as a tree or rock, some of its wavelengths are absorbed; the wavelengths not absorbed are scattered (reflected). Our eyes capture this scattered light and send the information to our brain, which interprets it as color. When light strikes water, some is absorbed, some passes through to reveal the submerged world, and some is reflected by the surface.

Light traveling from one medium to another (e.g., from air into water) refracts (bends). Different wavelengths refract different amounts, causing the light to split into its component colors.

To understand the interaction of water and light that creates a rainbow, it’s simplest to visualize what happens when sunlight strikes a single drop. Light entering a water drop refracts (bends), with different wavelengths refracting different amounts, which separates the originally homogeneous white light into the myriad colors of the spectrum.

But simply separating the light into its component colors isn’t enough to create a rainbow—if it were, we’d see a rainbow whenever light strikes water. Seeing the rainbow spectrum caused by refracted light requires that the refracted light be returned to our eyes somehow.

A raindrop isn’t flat like a sheet of paper, it’s spherical, like a ball. Light that was refracted (and separated into multiple colors) as it entered the front of the raindrop, continues through to the back of the raindrop, where some is reflected. Red light reflects back at about 42 degrees, violet light reflects back at about 40 degrees, and the other spectral colors reflect back between 42 and 40 degrees. What we perceive as a rainbow is this reflection of the refracted light—notice how the top color of the primary rainbow is always red, the longest visible wavelength; the bottom color is always violet, the shortest visible wavelength.

Follow your shadow

Every raindrop struck by sunlight creates a rainbow. But just as the reflection of a mountain peak on the surface of a lake is visible only when viewed from the angle the reflection bounces off the lake’s surface, a rainbow is visible only when you’re aligned with the 40-42 degree angle at which the raindrop reflects the spectrum of rainbow colors.

Fortunately, viewing a rainbow requires no knowledge of advanced geometry. To locate or anticipate a rainbow, picture an imaginary straight line originating at the sun, entering the back of your head, exiting between your eyes, and continuing down into the landscape in front of you—this line points to the “anti-solar point,” an imaginary point exactly opposite the sun. With no interference, a rainbow would form a complete circle, skewed 42 degrees from the line connecting the sun and the anti-solar point—with you at the center. (We don’t see the entire circle because the horizon usually gets in the way.)

Because the anti-solar point is always at the center of the rainbow’s arc, a rainbow will always appear exactly opposite the sun (the sun will always be at your back). It helps to remember that your shadow always points toward the anti-solar point. So when you find yourself in direct sunlight and rain, locating a rainbow is as simple as following your shadow and looking skyward—if there’s no rainbow, the sun’s probably too high.

High or low

Sometimes a rainbow appears as a majestic half-circle, arcing high above the distant terrain; other times it’s merely a small circle segment hugging the horizon. As with the direction of the rainbow, there’s nothing mysterious about its varying height. Remember, every rainbow would form a full circle if the horizon didn’t get in the way, so the amount of the rainbow’s circle you see (and therefore its height) depends on where the rainbow’s arc intersects the horizon.

While the center of the rainbow is always in the direction of the anti-solar point, the height of the rainbow is determined by the height of the anti-solar point, which will always be exactly the same number of degrees below the horizon as the sun is above the horizon. It helps to imagine the line connecting the sun and the anti-solar point as a teeter-totter with you at the pivot: as one seat rises above you, the other drops below you. That means the lower the sun, the higher the anti-solar point and the more of the rainbow’s circle you see above the horizon; conversely, the higher the sun, the less of the rainbow’s circle is above the horizon and the flatter (and lower) the rainbow will appear.

Assuming a flat, unobstructed scene (such as the ocean), when the sun is on the horizon, so is the anti-solar point (in the opposite direction), and half of the rainbow’s 360 degree circumference will be visible. But as the sun rises, the anti-solar point drops—when the sun is more than 42 degrees above the horizon, the anti-solar point is more than 42 degrees below the horizon, and the only way you’ll see a rainbow is from a perspective above the surrounding landscape (such as on a mountaintop or on a canyon rim).
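For those who like to see the geometry spelled out, here’s a minimal sketch of the teeter-totter relationship just described, assuming a flat, unobstructed horizon: the anti-solar point sits as many degrees below the horizon as the sun is above it, and the top of the primary bow is 42 degrees above the anti-solar point. The sample sun altitudes are arbitrary.

    # Minimal sketch of the rainbow-height geometry, assuming a flat, unobstructed horizon.
    # The anti-solar point is as far below the horizon as the sun is above it, and the
    # top of the primary bow sits 42 degrees above the anti-solar point.
    def rainbow_top_altitude(sun_altitude_deg: float, bow_radius_deg: float = 42.0) -> float:
        anti_solar_altitude = -sun_altitude_deg
        return anti_solar_altitude + bow_radius_deg

    for sun in (0, 10, 30, 42, 50):
        top = rainbow_top_altitude(sun)
        note = "visible" if top > 0 else "at or below the horizon (you'd need an elevated view)"
        print(f"Sun {sun} degrees up: rainbow top at {top:.0f} degrees -- {note}")

With the sun on the horizon the top of the bow stands a full 42 degrees high; once the sun climbs past 42 degrees, the entire bow drops below a flat horizon, just as described above.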

Of course landscapes are rarely flat. Viewing a scene from above, such as from atop Mauna Kea or from the rim of the Grand Canyon, can reveal more than half of the rainbow’s circle. From an airplane, with the sun directly overhead, all of the rainbow’s circle can be seen, with the plane’s shadow in the middle.

Double your pleasure

Not all of the light careening about a raindrop goes into forming the primary rainbow. Some of the light slips out the back of the raindrop to illuminate the sky, and some is reflected inside the raindrop a second time. The refracted light that reflects a second time before exiting creates a secondary, fainter rainbow skewed 50 degrees from the anti-solar point. Because this light has bounced twice, the colors of the secondary rainbow are reversed from the primary rainbow.

And if the sky between the primary and secondary rainbows appears darker than the surrounding sky, you’ve found “Alexander’s band.” It’s caused by all the light machinations I just described—instead of all the sunlight simply passing through the raindrops to illuminate the sky, some of the light was intercepted, refracted, and reflected by the raindrops to form our two rainbows, leaving less light for the sky between the rainbows.

Waterfalls are easy

From Yosemite’s Tunnel View each spring afternoon, a rainbow can be viewed at the base of Bridalveil Fall. As the sun drops, the rainbow climbs, taking about 30 minutes to complete its ascent.

Understanding the optics of a rainbow has practical applications for photographers. Not only does it help you anticipate a rainbow before it happens, it also enables you to find rainbows in waterfalls.

Unlike a rainbow caused by rain, which requires you to be in exactly the right position to capture the incongruous convergence of rainfall and sunshine, a waterfall rainbow can be predicted with clock-like precision—just add sunshine.

Yosemite is my location of choice, but there’s probably a waterfall or two near you that will deliver. Just figure out when the waterfall gets direct sunlight early or late in the day, then put yourself somewhere on the line connecting the sun and the waterfall. And if you have an elevated vantage point, you’ll find that the sun doesn’t even need to be that low in the sky.

Moonbows

Understanding rainbow optics can even help you locate rainbows that aren’t even visible to the naked eye. A “moonbow” (lunar rainbow) is a rarely witnessed and wonderful phenomenon that follows all the natural rules of a daylight rainbow. But instead of resulting from direct sunlight, a moonbow is caused by sunlight reflected by the moon.

Moonlight isn’t bright enough to fully engage the cones in your eyes that reveal color, though in bright moonlight you can see the moonbow as an arcing monochrome band. But a camera on a sturdy tripod can use its virtually unlimited shutter duration to accumulate enough light to bring out a moonbow in full living color. Armed with this knowledge, all you need to do is put yourself in the right location at the right time.

Moonbow and Big Dipper, Lower Yosemite Fall, Yosemite :: Each spring the full moon and Yosemite Falls conspire to deliver a breathtaking moonbow display. And as if that’s not enough, the Big Dipper is suspended above as if it’s the source of Yosemite Falls.

Rainbow, Lipan Point, Grand Canyon

Rainbow, Lipan Point, Grand Canyon  :: Sometimes the rainbow doesn’t appear exactly where you want it to. In a perfect world this rainbow would have connected the rims of the Grand Canyon, but there was no vantage point on the rim that gave me that view. Nevertheless, I was able to use the canyon’s red rock as a foreground, and balance its exquisite depth with the rainbow.

The story of this image

Gary Hart Photography: Heaven Sent, Monsoon Rainbow, Vista Encantada, Grand Canyon North Rim

Heaven Sent, Monsoon Rainbow, Vista Encantada, Grand Canyon North Rim

Following a nice sunrise at the always beautiful Point Imperial, the Grand Canyon Monsoon photo workshop group spent two hours near Bright Angel Point photographing a spectacular electrical storm that delivered multiple lightning captures to everyone in the group. When the storm moved too close and drove us to safety (we’re resilient and adventuresome, not stupid), it would have been easy to call it a day and tally our bounty. I mean, who likes getting rained on? Photographers, that’s who.

Don Smith and I herded our group into the cars and headed to Cape Royal Road, where we could follow the Grand Canyon’s East Rim above Marble Canyon all the way to Cape Royal. Knowing that monsoon showers are fairly localized, the plan was to drive out of the cell that was dumping on us at the lodge and either shoot back at it, or (more likely) find another cell firing out over the canyon. In the back of my mind though was the hope for a rainbow above the canyon—dropping in the west, the sun was perfectly positioned for rainbows in the east.

The rainbow appeared just after we passed the Point Imperial Road junction, arcing high above the forest. Climbing through the trees toward the rim (and its views of Marble Canyon), my urgency intensified with the rainbow’s vivid color, but we were stuck behind a meandering tourist who clearly had different priorities. As tempted as I was to pass him, I knew that would be a mistake with three more cars following me. So we poked along at a glacial pace. After what felt like hours, we screeched to a halt at the Vista Encantada parking area with the rainbow hanging in there—I swear everyone was out of the car and scrambling for their gear before I came to a complete stop.

With a full rainbow above an expansive view, I opted for my Sony 12-24 lens on my a7RII, but immediately began to question that choice. While Vista Encantada offers a very pretty view, it’s not my favorite scene to photograph because of the less-than-photogenic shrubbery in the foreground—a telephoto lens definitely would have worked better to eliminate the foreground, but I wanted more rainbow. So after a few failed attempts to find a composition at the conventional vista, I sprinted into the woods to find something better. This turned out to be a wise choice, as the shrubs here were replaced with (much more photogenic) mature evergreens.

In a perfect world I’d have found an unobstructed view into the Grand Canyon, but as photographers know, the world is rarely perfect. Committed to my wide lens, I decided to use the nearby evergreens as my foreground, moving back just far enough for the rainbow to clear their crowns. Composing wide enough to include the trees top-to-bottom also allowed me to include all of the rainbow—suddenly my 12-24 lens choice was genius!

After finishing at Vista Encantada we continued down the road and photographed another rainbow from Roosevelt Point, then wrapped up the day with a sunset for the ages at Cape Royal. A great day indeed, all thanks to monsoon weather that would have kept most tourists indoors.

Join Me in a Grand Canyon Photo Workshop

Workshop Schedule || Purchase Prints


A Gallery of Rainbows

Click an image for a closer look and to view slide show.

 

The Shocking Truth About Lightning

Gary Hart Photography: Forked Lightning, Point Imperial, Grand Canyon

Forked Lightning, Point Imperial, Grand Canyon
Sony a7RIII
Sony 100-400 GM
Lightning Trigger LT-IV
ISO 400
f/7.1
.4 seconds

Every year for the last 10 (or so) years I’ve traveled to the Grand Canyon during the Southwest summer monsoon to photograph lightning. Not only have I captured hundreds of lightning strikes and lived to tell about it (yay), I’ve learned a lot. A couple of years ago I added an article sharing my insights on photographing lightning to my photo tips section. With lightning season upon (or almost upon) us here in the United States, I’ve updated my article with new images and additional info. You can still find the article (with updates) in my Photo Tips section, but I’m re-posting it here in my regular blog feed as well.

Read the story of this image at the bottom of this post, just above the gallery of lightning images.


How to Photograph Daylight Lightning Without Getting Killed (Probably)

Let’s start with the given that lightning is dangerous, and if “safety first” is a criterion for intelligence, photographers are stupid. So combining photographers and lightning is a recipe for disaster.

Okay, seriously, because lightning is both dangerous and unpredictable, before attempting anything that requires you to be outside during an electrical storm, it behooves you to do your homework. And the more you understand lightning, how to avoid it and stay safe in its presence, the greater your odds of living to take more pictures. Not only will understanding lightning improve your safety, a healthy respect for lightning’s fickle power will also help you anticipate and photograph lightning.

Lightning enlightenment

Lightning is an electrostatic discharge that equalizes the negative/positive polarization between two objects. In fact, when you get shocked touching a doorknob, you’ve been struck by lightning. The cause of polarization during electrical storms isn’t completely understood, but it’s generally accepted that the culprit is the extreme vertical convective air motion inside a thunderstorm. (Convection is the up/down circular flow created when less-dense warm air rises, becomes more dense as it cools with elevation, and ultimately becomes cool/dense enough to fall; it’s also what causes the bubbling in boiling water.) Convection in a thunderstorm carries positively charged molecules upward and negatively charged molecules downward. Because opposite charges attract each other, the extreme polarization (positive charge at the top of the cloud, negative charge near the ground) is quickly (and violently) equalized: lightning.

With lightning comes thunder, the sound of air expanding explosively when heated by a 50,000 degree jolt of electricity. The visual component of the lightning bolt that caused the thunder travels to you at the speed of light, over 186,000 miles per second (virtually instantaneous regardless of your distance on Earth). But lightning’s aural component, thunder, only travels at the speed of sound, a little more than 750 miles per hour—nearly a million times slower than light. Knowing that the thunder occurred at the same time as the lightning flash, and how fast both travel, we can compute the approximate distance of the lightning strike. At 750 miles per hour, thunder travels about a mile in five seconds: Dividing the time between the lightning’s flash and the thunder’s crash by five gives you the lightning’s distance in miles; divide the interval by three for the distance in kilometers. If five seconds pass between the lightning and the thunder, the lightning struck about one mile away; fifteen seconds elapsed means it’s about three miles away.
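The flash-to-bang arithmetic in the previous paragraph is simple enough to do in your head, but here it is as a tiny sketch for anyone who wants to sanity-check the numbers:

    # Flash-to-bang distance: sound covers roughly a mile in five seconds
    # (about a kilometer in three), so divide the delay accordingly.
    def lightning_distance(delay_seconds: float) -> tuple[float, float]:
        miles = delay_seconds / 5
        kilometers = delay_seconds / 3
        return miles, kilometers

    miles, km = lightning_distance(15)
    print(f"15-second delay: roughly {miles:.0f} miles ({km:.0f} km) away")

A 15-second delay works out to roughly three miles (five kilometers), exactly the kind of distance where you should already be thinking about shelter.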

Lightning safety

The 30 (or so) people killed by lightning in the United States each year had one thing in common with the rest of us: they didn’t believe they’d be struck by lightning when they started whatever it was they were doing when they were struck. The only sure way to be safe in an electrical storm is to be in a fully enclosed structure or metal-framed vehicle, away from open windows, plumbing, wiring, and electronics.

While there’s no completely safe way to photograph lightning, it doesn’t hurt to improve your odds of surviving to enjoy the fruits of your labor.  (Unfortunately, photographing lightning usually requires being outside.) Most lightning strikes within a six mile radius of the previous strike. So, if less than thirty seconds elapses between the flash and bang, you’re too close. And since “most” doesn’t mean “all,” it’s even better to allow a little margin for error. Thunder isn’t usually audible beyond ten miles—if you can hear the thunder, it’s safe to assume that you’re in lightning range.

But if you absolutely, positively must be outside with the lightning crashing about you, or you simply find yourself caught outside with no available shelter, there are a few things you can do to reduce the chance you’ll be struck:

  • Avoid water
  • Avoid high ground
  • Avoid exposed areas
  • Avoid metal or electronic objects
  • Avoid tall objects such as trees and open structures (and tripods)
  • Stay at least fifteen feet from other people
  • Do not lie down
  • If you’re surrounded by trees, position yourself near shorter trees, as far from trunks as possible
  • Crouch with your feet together and your hands covering your ears
  • A lightning strike is often preceded by static electricity that makes your hair stand on end and an ozone smell (best described as the smell of electricity—I think of bumper cars at the amusement park, or the smell of my electric slot cars when I was a kid)—if your hair starts to stand up and/or you notice a distinct odor that could be ozone, follow as many of the above steps as you can, as quickly as possible (often you’ll only have time to crouch)
Three Strikes, Bright Angel Point, North Rim, Grand Canyon

Lightning How-to

Photographing lightning at night is mostly a matter of pointing your camera in the right direction with a multi-second shutter speed and hoping the lightning fires while your shutter’s open—pretty straightforward. Photographing daylight lightning is a little more problematic. It’s usually over before you can react, so without a lightning sensor to recognize lightning and click your shutter, success is largely dumb luck (few people are quick enough to see it and click). And using a neutral density filter to stretch the exposure time out to 20 or 30 seconds sounds great in theory, but a lightning bolt with a life measured in milliseconds, captured in an exposure measured in multiple seconds, will almost certainly lack the contrast necessary to be even slightly visible.

Lightning Trigger: The best tool for the job

Most lightning sensors (all?) attach to your camera’s hot shoe and connect via a special cable to the camera’s remote-release port. When engaged, the sensor fires the shutter (virtually) immediately upon detecting lightning, whether or not the lightning is visible to the eye or camera. With many lightning sensors from which to choose, before I bought my first one I did lots of research. I ended up choosing the sensor that was the consensus choice among photographers I know and trust: Lightning Trigger from Stepping Stone Products in Dolores, CO. At around $350 (including the cable), the Lightning Trigger is not the cheapest option, but after leading many lightning-oriented photo workshops, I can say with lots of confidence that lightning sensors are not generic products, and the internal technology matters a lot. Based on my own results and observations, the Lightning Trigger is the only one I’d use and recommend (I get no kickback for this). On the other hand, if you already have a lightning sensor you’re happy with, there’s no reason to switch.

I won’t get into lots of specifics about how to set up the Lightning Trigger because it’s simple and covered fairly well in the included documentation. But you should know that one of the things that sets the Lightning Trigger apart from many others is its ability to put your camera in the “shutter half pressed” mode, which greatly reduces shutter lag (see below). That also means that connecting the Trigger will probably disable your LCD review, so you won’t be able to review your captures without disconnecting—a simple but sometimes inconvenient task. You also probably won’t be able to adjust your exposure with the Lightning Trigger connected.

The Lightning Trigger documentation promises at least a 20 mile range, and after many years using mine at the Grand Canyon, I’ve seen nothing that causes me to question that. It also says you can expect the sensor to fire at lightning that’s not necessarily in front of you, or lightning you can’t see at all, which I can definitely confirm. For every click with lightning in my camera’s field of view, I get many clicks caused by lightning I didn’t see, or that were outside my camera’s field of view. But when visible lightning does fire somewhere in my composition, I estimate that the Lightning Trigger clicked the shutter at least 95 percent of the time (that is, even though I got lots of false positives, the Lightning Trigger missed very few bolts it should have detected). Of these successful clicks, I actually captured lightning in at least 2/3 of the frames.

The misses are a function of the timing between lightning and camera—sometimes the lightning is just too fast for the camera’s shutter lag. In general, the more violent the storm, the greater the likelihood of bolts of longer duration, and multiple strokes that are easier to capture. And my success rate has increased significantly beyond 2/3 since switching from a Canon 5DIII to Sony mirrorless (more on this in the Shutter Lag section).

The Lightning Trigger documentation recommends shutter speeds between 1/4 and 1/20 second—shutter speeds faster than 1/20 second risk completing the exposure before all of the secondary strokes fire; slower shutter speeds tend to wash out the lightning. To achieve daylight shutter speeds between 1/4 and 1/20 second, I use a polarizer, with my camera at ISO 50 and aperture at f/16 (and sometimes smaller). Of course exposure values will vary with the amount of light available, and you may not need such extreme settings when shooting into an extremely dark sky. The two stops of light lost to a polarizer help a lot, and a 4- or 6-stop neutral density filter is even better with fairly bright skies (but if you’re using a neutral density filter, try to avoid shutter speeds longer than 1/4 second).
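If you’re wondering how those settings land in the 1/4 to 1/20 second window, here’s a rough, hedged sketch of the stop counting. It leans on the Sunny 16 rule (shutter roughly 1/ISO at f/16 in full sun) as a baseline, which is my own assumption rather than anything from the Lightning Trigger documentation, and monsoon skies are usually darker than full sun, so treat the numbers as ballpark and meter the actual scene.

    # Rough stop-counting sketch (assumes the Sunny 16 rule as a bright-sky baseline).
    # Each stop of filtration or scene darkness doubles the shutter time.
    def daylight_shutter_seconds(iso: int, filter_stops: float, scene_stops_darker: float = 0.0) -> float:
        base = 1.0 / iso  # Sunny 16: shutter is about 1/ISO at f/16 in full sun
        return base * (2 ** (filter_stops + scene_stops_darker))

    # ISO 50 at f/16 with a 2-stop polarizer, under a sky about one stop darker than full sun:
    print(f"{daylight_shutter_seconds(50, 2, 1):.2f} seconds")  # about 0.16s, roughly 1/6 second

Starting from 1/50 second at ISO 50 and f/16, the polarizer’s two stops and a slightly darkened sky stretch the exposure to roughly 1/6 second, comfortably inside the target range; a 4- or 6-stop ND pushes it longer still, which is why brighter skies call for the stronger filter.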

Shutter lag

Lightning is fast, really, really fast, so the faster your camera’s shutter responds after getting the command from the trigger device, the more success you’ll have. The delay between the click instruction (whether from your finger pressing the shutter button, a remote release, or a lightning sensor) and the shutter firing is called “shutter lag.”

The less shutter lag you have, the better your results will be. The two most important shutter lag factors are:

  • Camera model: It’s surprising how much shutter lag can vary from manufacturer to manufacturer and model to model. In a perfect world, for lightning photography your camera’s shutter lag will be 60 milliseconds (0.06 seconds) or faster (the lower the number the better), but 120 milliseconds (0.12 seconds) or faster can give you some success. The top cameras from Sony, Nikon, and Canon are all fast enough, but the latest Sonys are the definite shutter lag winner (fastest), with Nikon a not too distant second, and Canon third. And shutter lag can vary with the manufacturer’s model: While my Sony a7RII is one of the fastest cameras out there, my a7R was unusably slow, so you need to check your model. Since I don’t check every camera released, it’s possible this ranking will change well before I update this article, so I recommend that you research shutter lag for your camera model. Unfortunately, shutter lag isn’t usually in the manufacturer’s specifications, so it’s hard to find. The best source I’ve found is the “Pre-focused” time in the Performance tab of the camera reviews at Imaging Resource.
  • Camera settings: Basically, to minimize the “thinking” the camera needs to do before firing, you want to be in manual everything mode—metering and focus. If your camera offers an electronic front curtain option (as my Sonys do), use it. If you must autofocus, go ahead and do it each time you recompose, then turn autofocus off as soon as you’re focused. Though the Lightning Trigger documentation suggests Aperture Priority metering, I use and recommend Manual metering mode to eliminate any camera-slowing metering (but Aperture Priority is fine if you have a strong preference). And despite what the Lightning Trigger documentation suggests, noise reduction is a post-capture function that might slightly delay continuous frames, but it won’t increase shutter lag.


Other equipment

In addition to a lightning sensor and fast camera, you’ll need:

  • A solid tripod and head: Don’t even think about trying to photograph lightning hand-held
  • Rain gear that keeps you dry from head-to-toe
  • Umbrella (a.k.a., Wile E. Coyote Lightning Rod) to shield your camera and lightning sensor (many sensors, including the Lightning Trigger, aren’t waterproof) while you compose and wait in the rain. The umbrella is for when you’re photographing storm cells at a great distance, such as on the rim of the Grand Canyon when the lightning is across the canyon. Obviously, when the lightning gets within 10 miles, put the umbrella down and run for cover.
  • Lens hood to shield some of the raindrops that could mar the front element of your lenses
  • Neutral density filter and/or polarizer to slow shutter speed into the ideal range (1/4 – 1/20 second)
  • A garbage bag (my choice) or rainproof camera jacket (haven’t found one) to keep your camera and sensor dry during a downpour
  • Extra lightning sensor batteries (better safe than sorry)
  • Extra memory cards: When a storm is very close or active, your lightning sensor could detect 20 or 30 strikes per minute (even when little or no lightning is visible to the eye)
  • Infrared remote to test your Lightning Trigger; I sometimes borrow the remote from my hotel room, but the Apple TV remote works great and is extremely compact (fits nicely into the Lightning Trigger pouch)
  • A towel 

Getting the shot

Lightning is most likely to strike in or near the gray curtains (clearly recognizable as distant rain) that hang beneath dark clouds. In addition to visible rain curtains, the darkest and tallest clouds are usually the most likely to fire lightning. Here are a few more points to consider:

  • The wider your composition, the greater your odds of capturing lightning, but the smaller the lightning will appear in your image.
  • Identify the most likely lightning cell and find the best composition that includes it. I tend to start with wider compositions to ensure success, then tighten my composition once I’m fairly confident I captured something.
  • Note the height from which the lightning originates and be sure to include enough cloud to get all of the stroke. On the other hand, don’t include too much room above the lightning—the most frequent rookie mistake I see is too much sky/clouds in the frame. The second most frequent is lightning cut off at the top. Unless the storm is too close for safety, for any given cell, most lightning will originate from about the same height above the ground.
  • The best lens is usually a midrange zoom such as a 24-70 or 24-105—if you find yourself reaching for the 16-35 (or wider), you’re too close.
  • On the other hand, once you’re sure you’ve captured some good strikes, try putting on a 70-200. The narrow field of view can significantly reduce the number of frames with lightning, but the ones you get will be much larger in the frame and therefore more spectacular.
  • Don’t forget to try some vertical compositions. I usually wait until after I know I’ve captured some in a horizontal frame because vertical narrows the horizontal field of view and lowers the odds of success a little.
  • Lightning stands out better in a slightly underexposed image. My target shutter speed is usually 1/8 second (slow enough to include multiple pulses, but not so slow that I risk washing out the lightning). When the sky is relatively bright, dropping to 1/15 or even 1/20 second can make the lightning stand out better than 1/8 (but risks losing secondary strikes). Conversely, when the sky is extremely dark and the lightning is firing like crazy, extending to 1/4 second might increase your chances for multiple pulses.
  • Just because you’re standing around waiting for things to happen doesn’t mean there’s nothing to do. Keep your eyes glued to the sky and adjust your composition as the lightning shifts, or as new activity starts elsewhere. If you wait until you hear your shutter click or someone else exclaim before looking up, you won’t see the lightning. And monitor the light—your exposure can change by several stops as the storm moves, intensifies, or winds down.
  • Try not to check your captures on your LCD until you’re done (or better yet, until you upload your images to your computer). With the Lightning Trigger (and some other sensors), viewing the LCD requires turning off the sensor, which risks missing a shot (I’m pretty sure lightning waits for me to turn off my sensor), and you’ll also find that many successful captures, especially wide compositions with a relatively bright sky, just aren’t that visible on an LCD viewed in daylight anyway.

Do as I say (not as I do)

Be aware that electrical storms can move quite quickly, so you need to monitor them closely. Sometimes this simply means adjusting your composition to account for shifting lightning; other times it means retreating to the car if the cell threatens your location. No shot is worth your life.

About this image

Gary Hart Photography: Forked Lightning, Point Imperial, Grand Canyon

Forked Lightning, Point Imperial, Grand Canyon

On the first evening of last year’s second Grand Canyon Monsoon photo workshop, Don Smith and I took the group to Point Imperial for a sunset shoot. Based on the forecast we had little hope for lightning, but one thing I’ve learned over the many years of photographing the monsoon here is that the forecast isn’t the final word. We got another reminder of this that evening.

The view from Point Imperial is both expansive and different from other Grand Canyon vistas, stretching east across the Painted Desert and north to the Vermillion Cliffs. As the group made their way down to the vista platform, out of the corner of my eye I thought I saw a lightning strike far to the north. A second bolt confirmed my discovery and soon we had the entire group lined up with cameras pointed and triggers ready.

With everyone in business, I set up my tripod and attached my Lightning Trigger to my Sony a7RIII. This lightning was close to 30 miles away, maybe farther than any lightning I’ve ever tried to photograph, so I hauled out my Sony 100-400 GM lens and zoomed in as tight as I could. I didn’t have to wait long to confirm that my Lightning Trigger would catch strikes this distant—it didn’t hurt that these were massive bolts, many with multiple pulses and forks.

Everyone was thrilled, so thrilled that it didn’t immediately register that the storm was moving our direction. I started at 400mm, but by the time I captured this frame I was at just a little more than 100mm. That’s still a pretty safe distance, but with night almost on us and another cell moving in from the east, we decided to take our winnings and go home.

One final note: If you check my exposure settings, you’ll see that my shutter speed here was 0.4 seconds, well outside the 1/20-1/4 second range I suggest. But if you look at the other settings, you’ll see that I’d opened up to f/7.1 and cranked my ISO to 400, an indication that twilight was settling in. Successful lightning photography is all about contrast, and the darker the sky, the better the bolt stands out, even in a longer exposure. Had we stayed past dark (and lived), we could have jettisoned the Lightning Triggers and used multi-second exposures.

Join Don Smith and me in our next Grand Canyon Monsoon Photo Workshop

Read my article in Outdoor Photographer magazine, Shooting the Monsoon

Workshop Schedule || Purchase Prints


A Lightning Gallery

Click an image for a closer look and slide show. Refresh the window to reorder the display.

2018 Highlights

Gary Hart Photography: Milky Way Reflection, Colorado River, Grand Canyon

Milky Way Reflection, Colorado River, Grand Canyon
Sony a7S II
Rokinon 24mm f/1.4
20 seconds
f/1.4
ISO 12,800

I’ve always struggled with the “top-whatever” end-of-year countdown of my favorite images because the choices are so subjective and mood dependent, and so many images are favorites as much for their memories as they are for their aesthetic value. And coming up with a predetermined number is arbitrary, and inevitably requires choices I don’t want to make and will almost certainly regret later. One year I may have only seven or eight images that thrill me; the next year I might have two dozen. This year I chose 27, and I still have some left to process.

So rather than attempt to rate and rank my images at year’s end, I prefer using them as a catalyst for reflection. Each December I go through everything I’ve processed from the waning year (this year I know of several that would certainly qualify as a highlight but they’re as yet unprocessed) and think about the circumstances of their capture.

I remember


I remember the New Year’s Eve solo drive to Yosemite to photograph the rising full moon, followed by a night drive to the other side of the Sierra (a six-hour drive in winter), where I hoped to capture the full moon setting behind Mt. Whitney. The Yosemite part of that trip was spectacular, the Mt. Whitney half was a photography flop, but I enjoyed the entire journey.

I remember nearly a month in New Zealand, photographing the South Island’s unmatched beauty in its most beautiful season (hint: brrrrrrr). In New Zealand I hiked on a glacier, photographed the (far superior) Southern Hemisphere version of the Milky Way, was chased through a fjord by leaping dolphins, witnessed one of the most vivid crimson sunrises I’ve ever seen, and logged hundreds of quality kilometers with a group of wonderful people.


I remember a solo drive to Yosemite to photograph fresh snow, never a sure thing regardless of the forecast. As I approached Yosemite the evening prior, I felt like a lone spawning salmon fighting upstream against the continuous stream of headlights evacuating Yosemite in advance of the storm. I settled into my room in dark and dry Yosemite Valley, and woke to so much snow that I couldn’t find my car. I’m convinced there is nothing, nothing on Earth more beautiful than Yosemite Valley with fresh snow, and with the park mostly vacant and the noise-damping quality of powdery snow, for a few hours I felt like I had heaven all to myself.


I remember chasing lightning on the Grand Canyon’s North Rim, the thrill (and relief) when everyone in both workshop groups captured lightning, and an especially spectacular lightning storm that started in the telephoto distances and chased us to the cars. This year’s Grand Canyon workshops were altered by fires burning in and near the park and I feared that they’d spoil the photography—instead, in addition to all the lightning, we ended up with spectacular red-rubber-ball sunrises and sunsets that allowed genuinely unique images in this heavily photographed destination.

I remember arriving on the Big Island shortly after Kilauea had shut down after 35 years of continuous eruption, and discovering that between the just-concluded Kilauea eruption and the recently departed remnants of Hurricane Lane, I’d lost nearly half of my locations. Instead I ended up finding alternate photo spots that I like even better than the ones I lost. The high point (literally and figuratively) of that trip turned out to be a chilly, first-ever sunset and Milky Way shoot from atop 13,800-foot Mauna Kea.

I remember my Yosemite Fall Color workshop group finding Yosemite Valley at peak fall color, and three beautiful moonrises in my just-concluded winter moon workshop. And while thousands of photographers jockeyed for position beneath bone-dry Horsetail Fall in February, my workshop group set up elsewhere and photographed one of the most beautiful sunsets of the year.


I remember way back in January, along with my Death Valley workshop group, photographing my first-ever lunar eclipse (on the heels of my first-ever solar eclipse in August of 2017).


And I remember trudging through Grand Canyon sand by starlight to a spot that I’d decided before nightfall was probably not a good Milky Way candidate, and discovering that I was wrong. It turned out the level of the Colorado River had changed in the night, replacing mushy sand with a swirling pool that rendered the Milky Way’s reflection as a luminous abstract.

Gary Hart Photography: Milky Way Reflection, Colorado River, Grand Canyon

Milky Way Reflection, Colorado River, Grand Canyon

I could go on and on about my memories of 2018, but all these great memories also remind me of the unknown highlights in store for 2019. Certainly the planned trips, which include my first-ever Iceland visit (with Don Smith in preparation for our 2020 workshop), my first-ever Oregon Coast workshop (with Don Smith), another raft trip through the Grand Canyon, a return visit to New Zealand, and on and on. But what excites me more than anything is the inevitable surprises, those special moments that dazzle when dazzling is the last thing you expect. Bring it on!


2018 Highlights

 (Click an image for a bigger view, and to see a slide show)

Stop Being So Negative!

Sunset Lightning, Grand Canyon
Sony a7R III
Sony 24-105 f/4 G
1/5 second
F/9
ISO 400

Lightning (at a safe distance) is pretty cool. It has always fascinated me, partly for the ephemeral power that can explode a tree and disappear before my brain can register its existence, but also because lightning is a rare sight for these California eyes. But what exactly is going on in a lightning bolt? I thought you’d never ask….

The shocking truth about lightning

Lightning is an electrostatic discharge that equalizes negative/positive polarization between two objects. For example, when you get shocked touching the doorknob in your bedroom, you’ve been struck by your own personal lightning bolt. You got zapped because, courtesy of that carpet you just dragged your fuzzy slippers across, you picked up a few extra electrons that the doorknob was more than happy to relieve you of.

While the polarization process that happens in an electrical storm isn’t as thoroughly understood as the one in your bedroom, it’s generally accepted that a thunderstorm’s vertical, convective air motion shuffles electrons in the atmosphere. To jar your high school science memories, convection occurs when a fluid substance heats, becomes less dense, and rises until it cools and becomes dense enough to sink. (You initiate convection when you boil water.)

This up/down circular flow of atmospheric convection happens when air near the ground warms, expands, and rises. The rising air carries water vapor; since cooler air can’t hold as much moisture as warm air, the ascending water vapor eventually condenses into clouds. The convective motion jostling the water and ice molecules inside the clouds strips the molecules of electrons. Electrons are negatively charged and more dense than the surrounding air; freed of their conventional bonds, these electrons fall earthward. Overhead, the clouds relieved of many electrons are suddenly positively charged, while the ground below has been rendered negatively charged by virtue of its new electron surplus.

Because nature abhors any imbalance, these opposite charges attract each other. The extreme polarization in a thunderstorm—positive charge at the top of the cloud, negative charge near the ground—is quickly (and violently) equalized: lightning! So I guess you could say that lightning is God’s way of telling Earth, “Stop being so negative!”

With lightning come other atmospheric changes. The sudden infusion of a 50,000-degree electrical charge violently displaces the surrounding air, creating an audible compression wave that we know as thunder.

The visual component of the lightning bolt that caused the thunder travels to you at the speed of light, over 186,000 miles per second. But lightning’s aural component, thunder, travels only at the speed of sound, a mere 750 miles per hour (or so), nearly a million times slower than light.

Because lightning and its thunder are simultaneous, and we know how fast each travels, we can compute the lightning’s approximate distance. (Thunder’s speed varies slightly with atmospheric conditions; light’s speed is non-negotiable.) From our human perspective the lightning arrives instantaneously, but moving at 750 miles per hour, thunder takes around five seconds to travel a mile. So dividing the number of seconds that elapse between the lightning’s flash and its thunder’s crash by five gives you the lightning’s distance in miles (divide the interval by three for the approximate distance in kilometers). For example, if ten seconds pass between the lightning and the thunder, the lightning struck about two miles away; fifteen seconds means it’s about three miles away, and so on.
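
For anyone who’d rather let a few lines of code do the counting, here’s a minimal sketch of that flash-to-bang arithmetic (the Python function and its rounded divisors are mine, added purely for illustration, not anything I carry into the field):

# Flash-to-bang estimate: thunder covers roughly a mile every 5 seconds
# (a kilometer every 3), so the delay between flash and crash gives the
# approximate distance to the strike.
def lightning_distance(delay_seconds):
    miles = delay_seconds / 5.0       # ~5 seconds per mile at ~750 mph
    kilometers = delay_seconds / 3.0  # ~3 seconds per kilometer
    return miles, kilometers

# The example from the text: a 10-second delay is roughly 2 miles (about 3 km) away.
print(lightning_distance(10))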

This speed difference also explains why lightning comes and goes in milliseconds, while its thunder can rumble and roll for several seconds. Because a lightning bolt can travel many miles, the thunder from its nearest portions reaches you much sooner than its most distant components.

About this image

Each summer moisture from the Gulf of Mexico makes its way up into the American Southwest. The combination of moist air and extreme heat (to kick off convection) makes August ripe for thunderstorms at the Grand Canyon. For the last six years, Don Smith and I have scheduled two photo workshops hoping to photograph these thunderstorms and their effects (clouds, rainbows, and especially lightning).

But with unseasonably dry air in place, the forecast at the start of this year’s first Grand Canyon Monsoon workshop wasn’t especially favorable for lightning. I told the group during the orientation that I wasn’t concerned, that I’ve often seen forecasts like this change suddenly—then anxiously monitored every subsequent NWS forecast update with crossed fingers. In the meantime, we were all quite content photographing incredible smoke effects, courtesy of three nearby wildfires.

By the end of our second day I started seeing hints of moisture returning to the forecast toward the end of the workshop, with each forecast looking a little more promising than the one prior. By day four, the workshop’s final full day, I was downright optimistic.

We’ve always had better lightning success on the North Rim, partly because the view faces south, the direction from which the storms tend to arrive, and partly because our cabins at Grand Canyon Lodge are right on the rim. Grand Canyon Lodge also has a pair of view decks, shielded by lightning rods, that are ideal for photographing lightning.

The lightning started firing early on our final evening. We all rushed to the rim, attached our Lightning Triggers, and pointed toward the most promising clouds. Much to my relief, it wasn’t long before everyone in the group had at least one lightning image, and most had many more than just one.

But feeling a bit greedy, with nice clouds overhead, and the smoke that had set up camp in the canyon for most of the week suddenly scoured by heavy rain, I realized that all we needed to ignite a sunset lightshow was a little sunlight. I glanced westward and saw signs of clearing. Dare I hope for a sunset to go with this lightning? As if by divine intervention, the sun emerged from the clouds just a few minutes before sunset, infusing the canyon and its diaphanous rain bands with light that started amber and reddened with each passing minute.

When the choice is between a (relatively) bland scene most likely to get lightning and a better composition with just a slight chance for lightning, I usually take my chances and opt for the better composition. In this case the lightning had shifted a little north of the canyon, but I pointed my camera toward the better light over the canyon and crossed my fingers. So irresistible was the light that while waiting (and not wanting to change my composition and miss a lightning strike), I pulled my a7RII from my bag and clicked a couple of handheld frames due south, where no lightning was possible but the light was especially sweet. (Anyone who knows me will be shocked to hear that I took a picture without a tripod.)

Though several bolts fired during the five or so minutes before the sun disappeared, the one in this image was the only lightning I captured with the great sunset light. But all I wanted was one sunset strike, and I felt extremely lucky that it arrived just as the magenta glow reached its crescendo.

The lightning waxed and waned for several more hours. With the sun down the sky soon darkened enough for me to remove my Lightning Trigger and switch to long exposures in Bulb mode. I stayed until after 10:00, wrapping up with a couple of 20+ minute exposures that captured more than a dozen strikes each.

Grand Canyon Photo Workshops


A Lightning Gallery

Click an image for a closer look and slide show. Refresh the window to reorder the display.