Playing and Exporting video from Photos on a MBP

Hi all,

I shot a bunch of video on my iPhone 14, and it’s synced via iCloud into Apple’s Photos. It plays back beautifully on my iPhone, as expected. However, when I play it back or export it from Photos on my 2023 MacBook Pro, the video is completely washed out and looks way overexposed. This is definitely not normal! Any thoughts or suggestions on how to “fix” the issue so that the video is usable and looks like what I shot?

Cheers

Optimize iPhone Storage is on by default. It keeps smaller versions of your photos and videos on the device. You can turn it off in Settings and choose ‘Download and Keep Originals’ instead.

Good suggestion. I’ll give that a try.

I’ve been creating photos and videos on numerous iPhones for well over a decade, and I have never experienced what I’m now seeing. The video starts playing normally for less than a second before it becomes completely unusable in Photos. It’s as if it’s been overexposed by about 5 stops of light.

Playback and export of the video are now working as expected. I have no idea what changed. I do notice that there’s an HDR symbol in the upper-left corner of the videos, and perhaps some kind of processing is being done prior to playback, since I can see the image change when I click Play. The washed-out, overexposed issue is gone.
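
If anyone else runs into this, a quick way to confirm that a clip really is HDR (which is what that badge suggests) is to ask AVFoundation directly. This is just a rough sketch of what I would try; the file path is made up, so point it at a clip exported from Photos:

```swift
import AVFoundation

// Hypothetical path: point this at a clip exported from Photos.
let url = URL(fileURLWithPath: "/Users/me/Movies/exported-clip.mov")
let asset = AVURLAsset(url: url)

for track in asset.tracks(withMediaType: .video) {
    // True when the track carries HDR video (e.g. the HLG/Dolby Vision clips an iPhone records).
    let isHDR = track.hasMediaCharacteristic(.containsHDRVideo)
    print("Track \(track.trackID): HDR = \(isHDR)")
}
```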

Turning Optimize iPhone Storage off and turning Download and Keep Originals on are good suggestions, but I think what is going on is much more complex. Thanks to Apple’s minimalist approach to software documentation, we are all stumbling in the dark together here. My comments are essentially my impressions, derived from extensive experience and a little knowledge. Your experience is one more data point to help us all understand how Photos operates overall.

To facilitate its operations, Photos creates various versions (thumbnails, etc.) of the items it stores, as well as metadata structures whose content and complexity are unknown to us users. The raw data of individual items is analyzed by background daemons to discover content and to create a digital representation that can be analyzed further. Patterns discovered linking groups of digitized media lead to new metadata structures. Analysis of the entire library and all its data is an ongoing, iterative machine-learning process that takes a great deal of computation. Video, with its added issues of HDR, codec (H.264 versus H.265), color space, and so on, requires a much more complex Photos library data structure.
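
To make the HDR / codec / color-space point concrete, here is a rough sketch of the kind of per-track metadata any app has to reconcile when it handles these clips. It uses AVFoundation, the file path is made up, and I am not claiming this is what Photos does internally; it simply shows the sort of information involved:

```swift
import AVFoundation
import CoreMedia

// Turns a four-character codec code (e.g. 'hvc1') into a readable string.
func fourCC(_ code: FourCharCode) -> String {
    let bytes = [24, 16, 8, 0].map { UInt8((code >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .macOSRoman) ?? "\(code)"
}

// Hypothetical path: any video exported from Photos will do.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/Users/me/Movies/exported-clip.mov"))

for track in asset.tracks(withMediaType: .video) {
    for anyDescription in track.formatDescriptions {
        let description = anyDescription as! CMFormatDescription

        // 'avc1' is H.264; 'hvc1' or 'hev1' is H.265 (HEVC).
        let codec = fourCC(CMFormatDescriptionGetMediaSubType(description))

        // HDR clips typically carry BT.2020 primaries with an HLG or PQ transfer function.
        let primaries = CMFormatDescriptionGetExtension(
            description, extensionKey: kCMFormatDescriptionExtension_ColorPrimaries)
        let transfer = CMFormatDescriptionGetExtension(
            description, extensionKey: kCMFormatDescriptionExtension_TransferFunction)

        print("codec: \(codec), primaries: \(String(describing: primaries)), transfer: \(String(describing: transfer))")
    }
}
```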

To protect user privacy, reduce the load on Apple’s servers, and widen its competitive moat versus systems not powered by Apple silicon, macOS runs these calculations on the user’s own hardware. To reduce complaints that Photos and its various daemons dominate background processing, macOS lowers their priority, which extends the time needed to finish the initial processing of a newly added item. I think this led to the temporary issue you experienced: the video structures needed to properly view your video stored in iCloud had not yet been created.

My guess is that you might be able to view your video sooner by focusing your Mac’s processing on it. Open Activity Monitor so you can observe what processing activity happens. Rotate the video 90 degrees in Photos. When the activity of VTEncoder and VTDecoder subsides, rotate back to the original orientation and again wait for VTEncoder and VTDecoder to finish. You may still have to wait for some background processes to complete, but with the heavy lifting of creating multiple forms of the video done, the rest should follow relatively quickly. If you try this, let us know whether it works and what you observe.
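
If you prefer Terminal to Activity Monitor, a crude Swift sketch of the same observation is below. It simply snapshots CPU usage for any process whose name contains “VT”; the names I see on my own Macs are VTDecoderXPCService and VTEncoderXPCService, so treat that match as an assumption and adjust it to whatever Activity Monitor actually shows you:

```swift
import Foundation

// One snapshot of CPU% and command name for every process, filtered to
// anything that looks like a VideoToolbox helper. Run it again after
// rotating the clip to see whether the numbers have dropped.
func videoToolboxActivity() throws -> [String] {
    let ps = Process()
    ps.executableURL = URL(fileURLWithPath: "/bin/ps")
    ps.arguments = ["-axo", "pcpu,comm"]

    let pipe = Pipe()
    ps.standardOutput = pipe
    try ps.run()
    ps.waitUntilExit()

    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    let output = String(data: data, encoding: .utf8) ?? ""
    return output
        .split(separator: "\n")
        .map(String.init)
        .filter { $0.contains("VT") }   // crude match on the process name
}

do {
    for line in try videoToolboxActivity() {
        print(line)
    }
} catch {
    print("Could not run ps: \(error)")
}
```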
