Tracking My iPhone Camera Usage With a Deep Dive into EXIF Data

Originally published at: Tracking My iPhone Camera Usage With a Deep Dive into EXIF Data - TidBITS

I “downgraded” from the iPhone 16 Pro to the iPhone 17 this year because there was only one unique feature of the iPhone 17 Pro that interested me: the 4x/8x zoom courtesy of its Telephoto camera. I don’t think I regularly used the iPhone 16 Pro’s 5x zoom, courtesy of its tetraprism Telephoto lens, because I’m more interested in macro photography. The photos that make me the happiest are the extreme close-ups I get by stuffing my iPhone into a flower to capture the alien-looking stamens and pistils surrounded by the glorious colors of the petals. In the end, I decided that it was more important for me to experience the plain iPhone 17.

Peony

While making that decision, I wondered how many photos I’d taken in the last year with the iPhone 16 Pro’s 5x zoom. I poked briefly at smart albums in Photos but couldn’t figure it out at the time, so I proceeded with the move to the iPhone 17 without that data.

However, while chatting with Allison Sheridan about our next Chit Chat Across the Pond podcast topic, I commented that I had downgraded because I didn’t think I used the Telephoto camera much and felt that the iPhone 17’s new 48-megapixel Ultra Wide camera would do as well as the iPhone 16 Pro’s for macro shots. She said that she loved her iPhone 16 Pro’s 5x zoom and would give up the Ultra Wide camera and its macro photos in a heartbeat for the more capable optical zoom. That’s when I mentioned my failed smart album efforts, with which she had also struggled.

And then I went down the rabbit hole, emerging a day later with an interesting podcast discussion and this article.

ChatGPT and ExifTool

Part of the reason I had given up on using a smart album in Photos is that, after my initial experiments proved fruitless, I asked ChatGPT, and it was negative about that approach. Instead, it recommended Phil Harvey’s free ExifTool, a command-line tool for reading, writing, and editing metadata in numerous formats, including the EXIF format, which documents various camera-specific bits of metadata about every iPhone photo.

Although ExifTool is a little intimidating to install (it’s not signed, so macOS tries hard to prevent you from installing it for fear that it’s malware), it’s easy enough to use when ChatGPT provides commands to copy and paste.
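To give a sense of what those commands look like, here’s a minimal example of the sort of invocation involved (the tag names are standard ExifTool tags; the wrapper function is just a convenience for illustration, not part of the script discussed below):

```shell
# Print the camera-related EXIF fields discussed in this article.
# Requires exiftool on the PATH; pass one or more image files.
camera_fields() {
  exiftool -Model -LensModel -FocalLength -FocalLengthIn35mmFormat "$@"
}
```

You’d call it with something like `camera_fields IMG_1234.HEIC`.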

Photos info window

After some back and forth, I discovered that part of my confusion with Photos is that it identifies the images taken with the iPhone 16 Pro’s Telephoto camera as having a 35mm-equivalent focal length of 120 mm, whereas ExifTool revealed that the actual focal length, as shown by the EXIF FocalLength field, was 15.7 mm. Similarly, the Wide camera reports a 35mm-equivalent focal length of 24 mm, but the actual focal length is 6.8 mm. The Ultra Wide camera reports a 35mm-equivalent focal length of 14 mm, corresponding to a true focal length of 2.2 mm.
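The 35mm-equivalent figure is just the actual focal length multiplied by the camera’s crop factor, which you can back out from the pairs above. A quick sketch of the arithmetic:

```shell
# Crop factor = 35mm-equivalent focal length / actual focal length.
# The pairs below are the EXIF values reported for the iPhone 16 Pro.
crop_factor() { awk -v eq="$1" -v act="$2" 'BEGIN { printf "%.1f\n", eq / act }'; }

crop_factor 120 15.7   # Telephoto  -> 7.6
crop_factor 24 6.8     # Wide       -> 3.5
crop_factor 14 2.2     # Ultra Wide -> 6.4
```

The crop factors differ because each camera has its own sensor; there’s no single conversion that applies to the whole phone.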

It makes sense that Photos would report the 35mm-equivalent focal length that photographers understand, even though the actual focal lengths are much smaller, as they must fit within the iPhone’s frame. However, when you’re building a smart album in Photos, it only knows about the actual focal lengths, not the 35mm equivalents. But I didn’t discover that until much later.

Once I could use ChatGPT to build complex ExifTool commands, I tasked it with counting the number of images from the last year that were taken with each of the three cameras. Here’s where the rabbit hole took a sharp turn downward, because I have nearly 49,000 images in my Photos library, and ExifTool isn’t quick at parsing them.

However, ChatGPT, ever helpful, offered to write a shell script that would accomplish what I wanted using techniques that would speed things up. In particular, it suggested finding all the images from the last year using the find tool, so ExifTool could examine the metadata from that subset rather than the full collection. And it did! It required a good bit of back and forth to get it working the way I wanted since I’m barely functional with shell scripting, but chatbots are nothing if not patient.
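The two-pass idea can be sketched in a few lines. This is not my actual script, just an illustration of the approach; the library path is a placeholder for wherever your Photos originals live, and it assumes exiftool is installed:

```shell
# count_by_lens DIR: list image files modified in the last year under DIR,
# then tally them by EXIF LensModel. find narrows the candidate set so
# ExifTool has to read far fewer files. Requires exiftool on the PATH.
count_by_lens() {
  # Compute "one year ago" on both BSD (macOS) and GNU date.
  cutoff=$(date -v-1y +%Y-%m-%d 2>/dev/null || date -d '1 year ago' +%Y-%m-%d)
  find "$1" -type f \( -iname '*.heic' -o -iname '*.jpg' \) \
      -newermt "$cutoff" -print0 |
    xargs -0 exiftool -fast2 -q -q -s3 -LensModel |
    sort | uniq -c | sort -rn
}
```

You’d point it at the originals folder, e.g. `count_by_lens "$HOME/Pictures/Photos Library.photoslibrary/originals"`.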

I shared the script with Allison, who was able to get it working after granting Full Disk Access permissions to Terminal. But as we pored over our respective numbers on the podcast, it became clear that they were, if not wrong, at least misleading. And probably wrong. Part of the issue was that the iPhone 16 Pro’s Wide camera can be used for regular shots and 2x zoom photos, and the Ultra Wide camera can be used for wide-angle 0.5x zoom pictures and macro close-ups. Plus, there were selfies that used the front camera, and a surprising number of photos—for both of us—that weren’t taken with the iPhone 16 Pro at all. Those came from iPads, old iPhones, and other people who had shared photos with us. Luckily, ExifTool can extract the appropriate focal lengths to identify each type of photo.

Verification was clearly necessary, so I came up with the idea of creating folders for the different types of photos and populating them with symlinks to all the matched photos. I figured it would be easy to use Quick Look to scan through a folder of symlinked images and see which ones didn’t belong. To keep the test runs quick and the scanning manageable, I also added date handling so the script could limit its work to a specified period. ChatGPT had no problem modifying the script as I directed, though, as always, quite a bit of back and forth was necessary to work through mistakes and incorrect assumptions.
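The symlink-bucket idea boils down to something like this (the folder location and names are placeholders, not my script’s actual layout):

```shell
# bucket NAME FILE...: symlink the matched photos into a per-category
# folder so they can be skimmed quickly with Quick Look.
bucket() {
  dir="$HOME/Desktop/exif-buckets/$1"; shift
  mkdir -p "$dir"
  for f in "$@"; do
    ln -sf "$f" "$dir/$(basename "$f")"
  done
}
```

Because the folders contain only symlinks, you can delete a whole bucket without touching the original images.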

Buckets from the script showing Telephoto shots

The folders were brilliant for revealing errors in the script. The most notable finding was that the script identified over 10,000 photos, which was significantly more than I had taken in the last year. The problem is that the find tool was looking at the last modified date for the original images, and Photos apparently tweaks that date after touching old images in some way. However, ExifTool can read the actual capture date. When ChatGPT rewrote the script to use find for the first pass and then have ExifTool identify only the photos from the last year, the total number dropped to about 1,500. In the end, this is what my script reported.
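ExifTool’s `-if` option is what makes that second pass possible: it can filter on the capture date stored in the file rather than the filesystem date. A sketch of the idea (not my script’s exact command):

```shell
# taken_since CUTOFF FILE...: print only the files whose EXIF capture date
# (DateTimeOriginal) is on or after CUTOFF, given as "YYYY:MM:DD".
# This ignores the filesystem modification date that misled the first pass.
taken_since() {
  cutoff="$1"; shift
  exiftool -q -q -fast2 -if "\$DateTimeOriginal ge '$cutoff'" \
    -p '$Directory/$FileName' "$@"
}
```

The string comparison works because EXIF dates are stored as “YYYY:MM:DD HH:MM:SS,” which sorts chronologically.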

Interestingly, one photo was still completely wrong. For reasons I can’t explain, it somehow ended up with an EXIF Model of “iPhone 7” while the LensModel was “iPhone 16 Pro back triple camera 2.22mm f/2.2.” Metadata corruption?

What About Those Telephoto Pictures?

As you can see in the screenshot above, I had 386 photos that relied on the Telephoto camera’s 5x zoom, but only 265 true macro photos from the Ultra Wide camera. Does that mean I made a mistake in not getting the iPhone 17 Pro?

No, and here’s where my folders were useful again. For me, Telephoto pictures fall into two categories: photos I (or a friend) take at my team’s cross country races and wildlife shots where I couldn’t get any closer to the animal in question. Unfortunately, every model of iPhone has always sucked at taking race photos. I need the zoom so I can fill the frame with the runners (and I usually crop the images even more afterward to focus on the people I care about), but even when it’s a sunny day, the shutter speed isn’t fast enough to capture images without some motion blurring. (I use the Camera+ app in its Action mode to take bursts at 2x zoom; the standard Camera app makes that too hard in stressful situations.)

Zoomed photos of Tonya racing

These photos of Tonya were actually taken in 2023 with the iPhone 15 Pro, since the pictures of her in 2024 using the iPhone 16 Pro were less flattering. But you can still see how blurry they are, particularly the one on the right.

So, although I have taken a lot of race photos, and they’re better than nothing, I’m always disappointed by how much better they could have been. My wildlife photos are also often disappointingly blurry due to motion from the animal or the camera.

The Script Versus Smart Albums

I’m uncertain how generally useful my script is, since it’s focused on identifying images from the iPhone 16 Pro’s different cameras, and only secondarily counts photos taken with other devices. Nevertheless, you’re welcome to download it if you’re in a situation similar to mine or want to modify it for your own use. Don’t worry if you aren’t a shell scripting expert, because any chatbot should be able to help you install ExifTool, make the script executable, and adjust it for your iPhone model.

Alternatively, armed with the knowledge I gained from working with ChatGPT on the script, you may be able to learn everything you need by creating smart albums in Photos instead of fiddling with the command line. This approach can collect all the photos taken with each of the iPhone 16 Pro’s cameras, but I was unable to extend it to differentiate between regular and macro shots taken with the Ultra Wide camera and regular and 2x zoom shots from the Wide camera.

The key, which I alluded to earlier, is to use the Lens condition to match the actual focal length of each camera rather than the 35mm-equivalent focal length that appears in the Photos interface. That’s more easily said than done, although you don’t have to know the specific text to match. If you start typing in the Lens condition’s field, an auto-complete pop-up with many possible options appears.

Tricks with smart album Lens condition

Unfortunately, the pop-up isn’t wide enough to display the full name of many of the options, so you have to hover over those to see the expanded name, which has more significant digits than what ExifTool reports for FocalLength. Even more confusing, you may see separate entries for “iPhone 16 Pro back camera” and “iPhone 16 Pro back triple camera.” The “back camera” term matches videos and, in my case, photos taken with the Camera+ app in its 2x zoom mode. The “back triple camera” term matches photos where I explicitly used the 5x zoom.

The practical upshot of all this is that you may need to match the Camera Model separately and then match just the focal length for the Lens condition—select the desired autocomplete entry and then delete the camera-specific text. My completed smart album for finding photos captured with the Telephoto camera looks like this, and you can easily modify it to pull out photos captured with the Wide and Ultra Wide cameras.

Smart album to find iPhone 16 Pro Telephoto shots

If you want to be complete and see how many selfies you’ve taken, that’s easier—just use the Photo is Selfie condition. Both the iPhone 17 and iPhone 17 Pro have Apple’s new 18-megapixel Center Stage camera for selfies, so those should improve regardless of which you choose.

Smart album to find iPhone 16 Pro selfies

What you can’t do in a smart album is identify the actual focal lengths used by different photos using the same lens. In contrast, my script can differentiate between regular photos taken with the Wide camera and those that use the 2x zoom, as well as distinguish between wide-angle images taken with the Ultra Wide lens and those that are macro shots. My belief is that Photos is looking only at the EXIF FocalLength field and ignoring the FocalLengthIn35mmFormat field used in the script. In an ideal world, Apple would give Photos smart albums an EXIF condition that would allow users to build complex rules matching any available EXIF fields.

Ultimately, I’m not sad to have opted for a less-expensive iPhone 17 this year, and I’m curious to see whether I miss the Telephoto camera throughout the coming year. If you’re trying to decide between the iPhone 17 and iPhone 17 Pro when upgrading from a previous iPhone Pro model with a Telephoto camera, create a smart album with a Lens condition that matches the actual focal length of your 3x or 5x zoom photos, and then look at the images and ask yourself, “Do they make me happy?”


Slightly off topic here, but you can use the Nitro Photo app to view the complete EXIF data for images (from the Photo Library or the Mac file system). It’s a one-at-a-time mode, but I use EXIF data all the time to figure out what the camera is doing (mostly “real” cameras, but also iPhone cameras). (This is done via the EXIF API access.)


Amazing dive into details but I would have thought it would be easier, say with Nitro, Graphic Converter, Aperture or Photos. Ack!

Admire your persistence @ace!

PhotoStatistica on the App Store does this automatically, with multiple sorting criteria and graph types.


Oh my, PhotoStatistica looks like huge fun. If I hadn’t already answered my question, I’d spring for the $4.99.

Adam-

In re: race shots…
Get a real camera, dude!
In my case “race” shots are auto racing and iPhones suck for that. I use my mirrorless digital cameras for that.
:upside_down_face:

I made the same decision, downgrading from iPhone 16 Pro to regular 17, and for the same reasons (although without any data). I just felt that I didn’t regularly take photos that required the Pro. I’m happy so far. (Except I’m a long-time Apple Upgrade Program person, and for the life of me I can never figure out where/how to return my old iPhone, and the folks in my local Apple Store didn’t seem to know either. I’ll call Apple Support tomorrow, since I upgraded last Monday and the clock is ticking.)

Yeah, if I took race photos regularly, I’d have to, but it’s something I do only occasionally. Plus, I’m irritated that Apple shows off fancy action shots that must be a combination of very carefully controlled situations and luck.

Adam-

Adam Engst (ace) wrote on September 29:

Yeah, if I took race photos regularly, I’d have to, but it’s something I do only occasionally.

Nikon has some nice, reasonably priced mirrorless cameras. Also, it still sells a few DSLRs, too.

Plus, I’m irritated that Apple shows off fancy action shots that must be a combination of very carefully controlled situations and luck.

I feel the same way. Its ads feature amazing stills and video footage, all shot under “perfect” conditions.

Another issue that irritates me is that phones record images at 72 dpi, which is useless if someone is shooting for print, either something that’s “poster size” or for use in a magazine. Sure, you can change that with P’shop or Affinity Photo, but there are always compromises in doing that versus having a hi-res original image.

Good emailing with you.

I’ve been a TidBITS follower for so long, I can’t remember when I signed up.

Regards,

Hib Halverson
General Manager
Shark Communications

Another issue that irritates me is that phones record images at 72 dpi, which is useless if someone is shooting for print

DPI (dots per inch) has nothing to do with resolution (the number of pixels in the image).

You can change the DPI setting and keep the resolution the same, which has zero effect on the quality of the image. For example, a 300-pixel-wide image at 300 dpi is 1 inch wide, and at 72 dpi it is about 4.17 inches wide.
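The arithmetic is just pixels divided by dpi; a quick sketch:

```shell
# Print width in inches = pixel width / dpi. Changing the dpi tag only
# rescales the nominal print size; the pixels (and quality) are untouched.
print_width() { awk -v px="$1" -v dpi="$2" 'BEGIN { printf "%.2f\n", px / dpi }'; }

print_width 300 300   # -> 1.00
print_width 300 72    # -> 4.17
```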

I have blown up many high resolution iPhone photos to poster size with no issues.


6 posts were split to a new topic: How to return a traded-in iPhone as part of iPhone Upgrade Program?

I really was into your dive with EXIF and ChatGPT, but I knew the reality that Apple is not a camera company. They sell a phone with a large sensor and a physically small optical lens setup, and market it as “optically equivalent to a 5x or 8x or 10x zoom lens.” There is something to say about a crop of a 12MP sensor, a crop of a 48MP sensor, and the full 48MP Fusion. (Fusion? What is that word? Remember their sleight-of-hand memory trick with the Fusion Drive, a leftover SSD paired with a 1TB hard drive and some sorcery in code to speed up (aka cache) frequently used files? Or Unified Memory, aka shared memory, which in the old days meant video RAM carved out of physical RAM, so that 8MB of RAM meant 2MB was for video. Clever tricks there, Apple!)

At least the new iPhone 17 Pro allows for RAW images: something without compression, and something that calls for a larger storage size (like 1TB) if you take a lot of pictures. Or, like me, you’ll be AirDropping or USB-C’ing to external storage often.

To me, a zoom lens is 100–400mm, or 80–200mm, or 24–80mm, at f/2.8 or even faster. (Those f/5.6 lenses are usually much cheaper than f/1.4 apertures.) An iPhone can take images, but an iPhone is not a professional DSLR; Apple just makes it look that way. And the EXIF showing 2.2mm… I wonder if that is the true lens size. And then there are the optical issues like blur, focus, light, grain, and response.

Case in point, try this test. If you have a window in your house with an outside screen (an older storm-window screen, or a newer one-piece that needs a ladder to remove, or one that can be removed from inside): with the window closed, try to take a picture of a subject outside in the distance. The screen will affect the phone’s imaging. With a DSLR, you can zoom and focus (without pinching, a clumsy, slow, unstable process on the iPhone or any phone) on the subject, and the screen may become effectively invisible, because your eye does the focusing (manually… though some DSLR autofocus will ignore that screen).

I had to take an image of a scofflaw through a window (think LE and surveillance), and at the very moment of the infraction, the quickest option was my iPhone, and every photo was ruined by the window screen. I could not remove the screen. Had I had a DSLR and a zoom lens, the photo could have been used in court. I mean, your eyes focus beyond that physical screen; your brain ignores that subtle blur because you are focused beyond it.

Anyway, I, too, like to use the iPhone camera for close-ups, because things like serial numbers, print that is light grey on white, and other hard-to-resolve print (for those of us over 40) need the phone’s optics. And none of those images will win awards.

I was planning to upgrade to the iPhone 17 Pro, which would be about $800 more than what I paid for my current phone, and about $1500 less than a Sony or Nikon DSLR. However, not one of the 12 nearest Apple stores (within a 100-mile radius) has stock of what I want until Oct 21st! I can’t just walk into an Apple store, ask for an iPhone 17 Pro, 256GB (colour doesn’t matter, but unlocked for any carrier does), and plop my titanium Apple Card down and buy one.

So then I thought, should I go with last year’s iPhone 16 Pro? Can’t get that either, unless refurbished through Amazon for $900… and for $200 more, I could get a new iPhone 17 Pro. Which I can’t, unless I wait two more weeks. Ugh. Guess I’ll wait until there is supply. Has it been almost a month? No, it’s been sixteen days since its release on the 19th… anyway, some good app tips here, as always, TidBITS!

Apple does like to go on about pro uses, but I don’t think any photographer really thinks there’s a valid comparison between a good camera and the iPhone.

The iPhone as a camera is interesting in many regards, primarily where the state of play with computational photography happens to be. And it’s just handy. Sometimes I’ve a quicker, better result with an iPhone out socially at night than I would with a mirrorless or DSLR, the math beating optics in those situations in ease of use. But I wouldn’t be thinking of them as photographs, more like snaps.

I do think that in video the iPhone has become more important, the sensor size increases have mattered more there given the relatively low resolution of 4k video, the computational advances had more impact. The iPhone can fit in all sorts of places a pro video camera cannot and how multiple iPhones can work quickly and cheaply feeding to an iPad is a thing to behold. You can definitely use it in a professional situation for the right material with the right final outcome. But there’s limits to it, always. Even with better cameras, I did use my Fuji GFX100 to shoot 4k raw, a night shot, a pickup for a feature film, destined for the cinema, and some of the shots had artifacts that the main ARRI35 we used would never have had. You always need the right kit for the right outcome.


I decided to purchase PhotoStatistica today. It is able to import EXIF data for some, but not all, of my images. See comparison of imported EXIF data versus the number of images in my Lightroom catalog:

I have emailed the developer to see if we can figure out the import problem.

Otherwise, the app seems to be very useful.


Update: after several back-and-forths with the developer he was able to identify the bug and has updated the app. It now works fine.

Now that it imports all my images, I have a better idea of what focal length, ISO, etc. I am using. Not surprisingly, it varies by camera, as I use each of them in different ways.
