Tips for better action shots with an iPhone 11 Pro?

I’ve been taking pictures of cross-country and trail races this fall, first with my iPhone X and now with the iPhone 11 Pro. And frankly, I’m unimpressed. Let’s assume for the moment that the problems are my fault and if I only knew how to do something better, the photos would come out better. Here’s what I do.

First, since the runners are moving fast and I want neither to affect their race nor to get run over myself, I pick what seems like a safe distance and use the 2x optical zoom. The iPhone is handheld, and light is generally pretty good, though I have no real control over that, and when it’s a little darker or a cloud goes over, the quality suffers. On the iPhone X, I always used burst mode to capture multiple photos. With the iPhone 11 Pro, the new press-and-slide-left trick for invoking burst mode is hard to use in the moment, so I mostly just took single shots. (And I lost one great shot of a father and daughter running into the finish together because I had ended up in video mode by mistake—amazingly, there’s no way to extract a still image from a video using built-in tools, so I was forced to take a screenshot of the video.)

Once I’m home, I go through the burst-mode shots and pick the best one, often choosing based on stride form, facial expression, or race position (who’s behind, etc.). Then I apply the Auto adjustment, which usually improves things a little, and crop the photo, sometimes heavily. The cropping is necessary because even with 2x zoom, I can’t get nearly as close as would be ideal. It’s also necessary to focus on the primary runner (such as the person on my team, as opposed to others in the race) and to set up some minor bit of narrative (showing them in front of someone else rather than behind, or coming out of a trail with a group well behind, etc.).

Because they’re moving fast, the camera has trouble focusing on them, and when I zoom and crop, any blur becomes more pronounced. I’ll see how the iPhone 11 Pro works in tomorrow’s PGXC race, but based on today’s trail race, it’s better at avoiding blur than the iPhone X, though it still ends up with really grainy photos.

PGXC Race #1 (iPhone X):
PGXC Race #2 (iPhone X):
Danby Down & Dirty (iPhone 11 Pro):

The main thing I’m going to try differently tomorrow is using a monopod, assuming the tripod mount that expands wide enough to hold an iPhone 11 Pro arrived from Amazon today (my previous one maxed out at iPhone 5–sized phones, something I didn’t realize until I tried it the other day). That will reduce stabilization problems, but may be too clumsy for real-world use at a race.

Any other suggestions for improving these action shots? Thanks!

You’re definitely pushing the limits of the iPhone cameras. Shooting sports and action is hard, and to do it well really requires better hardware. Ideally, you have a zoom lens to get closer, and a DSLR or mirrorless that can track subjects in real time. Recent Sony models do that, as does the Fuji X-T3 (which is the model I have). I believe the latest mirrorless Canon and Nikon models do, too.

However, you clearly don’t want to buy/lug/deal with a bigger system, and that’s okay; sports/action photography is a field where just getting a better camera won’t make you a sports photographer anyway.

Looking at the shots, it seems like the iPhone isn’t having trouble focusing: the trees in the background are all nice and sharp. :slight_smile: But that also tells me that the blurriness isn’t being caused by camera shake or user error. The phone just isn’t nailing the people, which of course is what you’re focused on.

What might help is your involvement with the focusing. I’m guessing these folks are coming by at a pretty quick clip, but you may be able to tap to focus on them and fire off the exposure real quick. Having the camera mounted on the monopod might be helpful, because you can use one hand to tap-to-focus and the other to take the shot (use one of the volume buttons to trigger the shutter), and that way you’re not also trying to stabilize the phone.

I do wish there was a better way to invoke burst mode in iOS 13. Tap-drag isn’t a good solution when you’re shooting fast like this. I saw someone online suggest mapping the volume buttons; holding volume-up could record video, while holding volume-down could capture bursts. But I don’t know how that could be done without Apple making it a setting somewhere.

One other possibility: use a manual app like Halide. One correction first, though: the iPhone’s lenses have fixed apertures (the 11 Pro’s telephoto is f/2.0), so no app can stop it down to f/8 for deeper depth of field. What a manual app does give you is control over shutter speed and ISO. If there’s plenty of light, locking in a fast shutter (say, 1/1000 s) will freeze motion, at the cost of a noisier image as the ISO rises to compensate. You could even combine that with a manual focus setting and take the shot only when the runner is crossing a particular spot, but I suspect that might lead to fewer shots in focus overall.
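To put rough numbers on why a fast shutter matters for freezing runners, here’s a back-of-envelope sketch. Every figure in it (runner speed, distance, field of view, image width) is an assumption chosen for illustration, not a measured value:

```python
import math

# Assumed figures: runner at 5 m/s, 10 m away, shot with a 52 mm-equivalent
# telephoto (~38 degree horizontal field of view), 4032 px image width.
speed_mps = 5.0
distance_m = 10.0
fov_deg = 38.0
image_width_px = 4032

# Width of the scene captured at the runner's distance, then pixels per meter.
frame_width_m = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
px_per_m = image_width_px / frame_width_m

for denom in (60, 250, 1000):
    blur_px = speed_mps * (1 / denom) * px_per_m
    print(f"1/{denom} s -> ~{blur_px:.0f} px of motion blur")
# 1/60 s   -> ~49 px (visibly smeared)
# 1/1000 s -> ~3 px  (effectively frozen)
```

The exact numbers don’t matter much; the point is that shutter speed, not aperture, is the lever you have for sharp runners on a phone.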


Buy a GoPro (the 8 just came out), shoot video, and select frames from the video? The stabilization on the 8 is apparently extremely good. GoPro, being made for this very thing, is likely to do a better job than an iPhone. Can’t say anything about the picture quality, though, since I don’t have a GoPro. (I have Garmin VIRBs, which I wouldn’t recommend at this point because Garmin has done nothing new in action cams in several years.)

The GoPro has very impressive stabilization (I have the GoPro 7 Black; the new 8 should be even better).
But it has a very wide-angle lens, which is good for landscape photography; for close-up photos you have to get very near the subject. It also has quite a bit of shutter lag. Photo quality is good, but GoPros are better suited to video.

I’ve been reading about the Deep Fusion feature for iPhone 11 models, which is about to graduate from public beta, and it sounds like it has great potential for producing sharper stills of fast motion by combining multiple images and then sharpening:

"How Deep Fusion technology works

Both Smart HDR and Deep Fusion capture multiple images and then the iPhone picks a reference photo that’s meant to stop motion blur as much as possible. It then combines three standard and one long exposure into a single “synthetic” long photo. Here is when Deep Fusion comes into play. Deep Fusion breaks down the reference image and synthetic long photo into multiple regions, identifying skies, walls, textures and fine details (like hair).

After this quick breakdown, the software does a pixel-by-pixel analysis of the two photos – that’s 24 million pixels in total. Lastly, the results of the analysis are used to determine which pixels to use and optimise in building a final image. This whole information is captured and processed in the background once iPhone’s A13 processor has a chance.

Looks complex? Well, Apple says that the entire process takes a second or so to happen. This means you don’t need to wait before clicking next shot.

There are some gotchas. You can’t use this with your phone’s ultra-wide angle lens, as hinted earlier, and bright telephoto shots will revert to Smart HDR to maintain better exposure. The capture process is quick, but it’ll take a second for your iPhone to process the image at full quality. And yes, you absolutely need a 2019 iPhone for this to work – it’s dependent on the A13 chip.

Since there has been quite a lot of discussion about Apple choosing to develop its own chips, it’s quite clear the A13 is responsible for great photo enhancements including Deep Fusion and Smart HDR that are keeping iOS cameras way ahead of the competition."
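The pixel-selection step in that description can be sketched in a few lines. This is purely a toy illustration of the “keep whichever image has more local detail” idea, using gradient magnitude as an assumed detail measure; it is not Apple’s actual algorithm, and the function names are invented:

```python
import numpy as np

def local_detail(img):
    # Crude per-pixel detail measure: gradient magnitude.
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def fuse(reference, synthetic_long):
    # Pixel-by-pixel selection: keep each pixel from whichever frame
    # shows more local detail, favoring the reference on ties.
    keep_ref = local_detail(reference) >= local_detail(synthetic_long)
    return np.where(keep_ref, reference, synthetic_long)

# A sharp-edged "reference" wins over a flat "long exposure" everywhere.
reference = np.zeros((10, 10))
reference[:, 5:] = 10.0
synthetic_long = np.full((10, 10), 5.0)
fused = fuse(reference, synthetic_long)
```

In the real pipeline the analysis is far more sophisticated (machine-learned and region-aware, distinguishing skies from hair), but the select-the-best-pixel structure is the same basic idea.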

Though I’m not going to upgrade my 8+ until 5G is well established, I am quite tempted by stuff like this.

Thanks for the advice, @jeffc. In an effort to improve focusing, I downloaded and used Camera+ 2, which features an action mode that supposedly does better focus tracking than the built-in app. It also has a burst mode that works like Apple’s old approach, meaning it’s useful. I also used a monopod to increase the stability of the iPhone.

Here are the shots. They’re better, I think, but still not exactly stunning.

PGXC Race #3 (iPhone 11 Pro):

I’m not interested in buying another camera at this point, since I don’t plan to make a hobby of taking these shots—once this cross-country season is over, it will be some months before I’d even have a chance to take more action shots. And from what @mac-list says, I suspect the GoPro isn’t the answer for this sort of photography, much as I’m sure it’s great for action videos.

We can always hope, although I wonder if third-party camera apps will be able to take advantage of it, or if it will be restricted to the built-in app. Given how bad the burst mode is in the iPhone 11 Pro’s Camera app, I doubt I’ll be using it for action shots again.

I’ve been photographing various events for years. My phone is an SE and I have a decent pocket point and shoot (and it was more than decent when it was new). The only way I could freeze the shot was if I took it at an angle with the object coming towards me. Panning was reasonable but that has its own set of issues.

In 2013 I bought a Canon SX50, which improved action shots quite a bit. Still not perfect, as it’s not a high-end camera, nor is it good in low light, but better than the phone or my smaller Canon.


How about going old school? Before the race, put something in the spot that the runners are going to be when you shoot, focus the phone on that using AF Lock (press and hold on a focal point until AF lock appears up top), remove the item you’ve used to focus, and then wait for the runners to come into the preset focus area. It’s a bit jury-rigged, but it might work better.

Thanks! I’ll test that before the next race, but I fear that the runners are moving too fast for me to necessarily capture them at the right moment. Plus, the beauty of burst mode is that I can choose the best image from 3-10 photos as the runners move through the camera’s field of view. I’m often choosing based on stride, facial expression, where they are in comparison with another runner, and so on, so being able to take in-focus photos of only one precise spot might give me a focused photo that doesn’t meet any of the other desirable variables.

Aha! If you combine a manual camera app as JeffC suggested, prefocusing so the zone of acceptable focus is reasonably deep (the iPhone’s tiny sensor gives you a fair amount of depth of field even at f/2), with my method, you’d have an area of focus rather than a point.

Alternatively, just tell the runners to stop actually running, hold a pretend action pose, and take a picture.


Worth a try!

They all like me, but I think I’d still be tarred and feathered after the race. Or perhaps just mowed over. :slight_smile:

You may well be aware of this, but if your editing isn’t necessarily required on the spot, you can achieve this in the Photos app by using the “Export Frame to Pictures” option. I have been more than happy with some of the resulting pictures, and it certainly provided me with a better result than a screenshot.



The 11 Pro photos are much better, both in clarity and color.

Where is this option? Are you talking about the iOS Photos app or the Mac Photos app? I can’t find it anywhere.

But alas, Deep Fusion won’t work with burst mode, according to The Verge.

I apologize for neglecting to add that this would be on macOS, in the Photos app.

Finally found it! It’s in the gear menu that appears when you’re playing a video. And even more freakily, Photos extracts the image to a TIFF file stored in your ~/Pictures folder. This is a mind-bogglingly horrible interface and user experience—Photos is a %^&$^% photo app, so why can’t it at least extract the frame to Photos itself? The feature seems to be the same in Mojave and Catalina.
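For what it’s worth, if you’d rather script this than fight the Photos interface, ffmpeg can pull a full-quality frame from the command line. A sketch assuming ffmpeg is installed (e.g. via Homebrew); the file names are invented for illustration:

```shell
# Generate a short sample clip so the example is self-contained;
# with a real video, skip this step and substitute your own file.
ffmpeg -loglevel error -f lavfi -i testsrc=duration=2:size=320x240:rate=10 \
  -c:v mpeg4 -y race.mp4

# Extract the single frame at the 1-second mark as a PNG.
ffmpeg -loglevel error -i race.mp4 -ss 00:00:01 -frames:v 1 -y frame.png
```

On long videos, putting `-ss` before `-i` makes the seek much faster, since ffmpeg jumps to the timestamp instead of decoding everything up to it.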

As a reward, you get to see a screenshot of the super cute photo that I was able to extract from the video, showing the relevant menu.