Posted on August 29, 2025
Posted by John Scalzi


For the new release of the Pixel 10 Pro (and the 10 Pro XL, which is mostly the same phone, just larger), Google has introduced something called the “Pro-Res Zoom.” Once you zoom in past about 30x and snap the photo, Google runs it through an “AI” processor, not to bring out the details that are actually there, but to make up details that seem reasonable to assume are there, based on whatever processing algorithm Google is currently using. It then saves the result of this guessing to your phone, alongside the original photo. Sometimes it looks pretty good! Sometimes it does not! But in neither case is what’s being output a photo. Rather, you now have a picture, or an illustration, based on a photo. It’s no more a real photo than a cartoon version of the photo would be. The verisimilitude at that point is the same.
Which is not to say that the Google Pixel 10 Pro can’t do a reasonably good job at approximation sometimes. Look at the before/after images of the strawberry above. The “before” version on the left is an unimproved photo, shot at about 50x zoom from across my backyard deck. The strawberry is blotchy and low-detail, which is perfectly reasonable, because the phone’s actual optical zoom only goes to 5x and everything past that is digital zoom, i.e., the camera crops in and uses a much smaller number of pixels to resolve the image. The image on the right is the “AI” recreation of the fruit. It looks much better, because Google “knows” what a strawberry is supposed to look like and builds on that. It does a good enough job that you can believe it actually is a photo – a heavily processed one, but one bearing some relationship to reality. That’s because as blotchy as the initial image is, it has enough detail that Google can reasonably extrapolate. That’s a strawberry, all right!
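The math behind that pixel crunch is simple enough to sketch. Here’s a back-of-the-envelope Python calculation; the sensor resolution and optical zoom figures are illustrative assumptions, not the Pixel 10 Pro’s actual spec sheet:

```python
# Rough sketch of why digital zoom loses detail so quickly.
# Assumed numbers: a 50 MP sensor behind a 5x optical lens
# (illustrative only; the real hardware specs may differ).

SENSOR_MP = 50.0     # assumed sensor resolution, in megapixels
OPTICAL_ZOOM = 5.0   # assumed optical zoom of the telephoto lens

def effective_megapixels(total_zoom: float) -> float:
    """Megapixels of real sensor data left at a given total zoom.

    Past the optical limit, the camera just crops the center of the
    frame, so the usable pixel count falls with the *square* of the
    extra digital zoom factor.
    """
    digital_factor = max(total_zoom / OPTICAL_ZOOM, 1.0)
    return SENSOR_MP / digital_factor ** 2

for zoom in (5, 30, 50, 94):
    print(f"{zoom:>3}x zoom -> {effective_megapixels(zoom):.2f} MP of real data")
```

Under those assumed numbers, 50x zoom leaves only about half a megapixel of actual data, and 94x leaves roughly a seventh of one, which is why there is so little real detail for the “AI” to work from.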

But the extrapolation breaks down, and quickly, when the details aren’t there. You can’t just “enhance” your way to clarity. This image of the end of my road, shot at about 94x zoom, makes the point: Stop signs aren’t circular, and the “STOP” letters aren’t there at all, replaced by white splotches. Overhead wires hang weirdly and disappear randomly. It’s an illustration, and not a particularly good one.

The model’s inability to resolve letters gives these images the feel of a dream where you’re trying to read a sign and can’t, because the letters won’t resolve and keep squirming around. The metaphor is exactly apt, because these pictures aren’t reality, they’re a hallucination, only by a computer rather than a human mind. I don’t hate it! I think the dream-like squiggles and weird simultaneously over-and-under-detailed images from the Pro Res Zoom can be aesthetically intriguing. But it’s not what I want a camera on my phone to do. I want it to take photos, not generate related-but-ultimately-fictional illustrations.

Below the point at which the Pro Res Zoom kicks in, the Pixel 10 Pro does take perfectly lovely photos – there is algorithmic processing there, too, but it’s dedicated to fixing light balances and choosing how to represent color and so on, which is to say, all the things that any digital camera does (see the photo above, of the actual strawberry from before, as an example). Google’s Pixel phones have consistently had some of the best cameras in the field, as much due to the software as the hardware, and that’s one of the reasons I’ve stuck with the brand when it comes time to upgrade.
To be fair to Google, the phone has built-in support for C2PA (Coalition for Content Provenance and Authenticity) Content Credentials, which means that when you use Pro Res Zoom, or any of the Pixel 10 Pro’s other “AI” editing tools, the fact that the image is “AI” generated/edited is embedded in the metadata. Google isn’t trying to fool anyone about what it’s doing. Of course, it’s not that difficult to strip metadata, and not everyone knows how to find that metadata anyway (do you?). I’m not going to fault Google for that. They are at least making the attempt to be clear about what’s happening when you use their “AI” tools, and I can appreciate the effort.
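If you do want to go looking, the ordinary EXIF side of an image’s metadata is easy to dump with Pillow, a common Python imaging library. A caveat: C2PA Content Credentials themselves live in a separate JUMBF container inside the file, so fully verifying them requires a dedicated C2PA verification tool rather than this sketch; this only shows the plain EXIF tags. The filename below is hypothetical:

```python
# A minimal sketch of inspecting an image's EXIF metadata with Pillow.
# Note: this shows ordinary EXIF tags only. C2PA Content Credentials
# are stored in a separate JUMBF box and need a dedicated verifier.

from PIL import Image
from PIL.ExifTags import TAGS

def dump_exif(path: str) -> dict:
    """Return the image's EXIF tags as a {tag_name: value} dict."""
    with Image.open(path) as img:
        exif = img.getexif()
        # TAGS maps numeric EXIF tag IDs to human-readable names.
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Usage (hypothetical filename):
# for name, value in dump_exif("PXL_20250829_strawberry.jpg").items():
#     print(f"{name}: {value}")
```

Stripping that metadata is as simple as re-saving the pixels to a new file, which is exactly why provenance labels are a courtesy rather than a guarantee.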
With that said, for my own part I’m unlikely to use the “Pro Res Zoom” much; I do like my photos to be actual photos when they come out of the camera; if they’re going to be edited, I want to be the one to edit them, so I can be fundamentally responsible for the images I’m presenting to others. As for everyone else, well, look: There’s an upper physical limit for phones on lenses and sensors, and phone manufacturers have been filling in those gaps with software for years. Google is maybe the first to do that with one of their zooms, at least on this scale, but it’s very unlikely they are going to be the last. We’ll see more of this.
As with everything else you see on the Internet and off of it, you are going to have to be the one who makes the call about whether you believe what you see with your own eyes, and whether what you’re seeing is a photo, or just a picture.
— JS