iPhone XS camera review: Complicated tech for simpler photography
Every time you take a picture with the new iPhone, you're triggering trillions of operations.
Learning the art of photography has always involved at least a little math. Whether it’s the relatively straightforward calculations involved in figuring out your exposure settings or the nearly inscrutable jumble of numbers required to navigate the settings on an old-school flash, there have always been numbers behind the process. What the hell is a foot-candle anyway?
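For a sense of what that older math looked like, here’s the classic exposure-value relationship between f-number $N$ and shutter speed $t$ (in seconds):

$$\mathrm{EV} = \log_2\frac{N^2}{t}$$

Shooting at f/8 and 1/60 s, for instance, works out to $\log_2(64 \times 60) \approx 11.9$, roughly EV 12, which standard exposure tables associate with an overcast day. Film shooters did this kind of arithmetic, or memorized tables of it, constantly.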
Modern cameras, however, involve even more mathematics than anything from the past. Apple’s new iPhone XS Max camera performs “trillions” of computations per photo, but you, the shooter, never really see any of them. In fact, it’s only thanks to a new feature (which has nothing to do with the actual mechanics of photography) that the iPhone camera even mentions a once-fundamental photographic concept like f-number, which tells you how much light your lens can let in through its aperture.
I’ve been shooting with the iPhone XS Max camera for a few days, and while it’s certainly one of the best smartphone cameras I have ever used, it took a little work to get my brain to abandon its traditional camera habits and embrace the future of computational photography. We’re in a world where taking one photo really involves taking many in rapid succession and letting a computer cram them together into a single shot. It’s a concept that’s changing what it means to look for “good light.”
The tech inside
On the spec sheet, the new iPhone camera doesn’t seem profoundly different from the hardware found inside the original iPhone X. It’s still a dual-camera setup with one wide-angle lens and a secondary telephoto lens that gives you a more zoomed-in field of view.
The imaging sensors that actually catch the light for the image are now slightly bigger than they were in the previous model. Apple doesn’t say exactly how much bigger, but the change has made both cameras slightly wider in terms of field of view. For the camera nerds: the wide lens now acts like a 26mm lens, while the telephoto now acts like a 52mm lens.
The resolution stays at 12 megapixels, but each pixel on the sensor is now deeper to capture light more effectively, which is important when shooting in low light. That’s an area in which these relatively tiny sensors have always struggled.
The most impactful piece of hardware when it comes to overall camera performance, however, is the new image signal processor on the phone’s A12 Bionic chip, which powers its new Smart HDR tech.
Computational photography
Every time you take a photo with the iPhone XS Max, you’re actually capturing several images. First, it saves several frames captured before you even press the button, pulled from a buffer that’s constantly running in the background. Then it takes even more photos, including the main reference frame and an image with a longer exposure time to try to capture extra detail in the shadows. This is computational photography: the processing engine combines raw image data from all of those frames into the final photo.
The processor then takes that mass of data and crunches it together into a single image file. It’s a far cry from the once-mechanical process of opening a little door in front of a piece of film and letting in light for 1/60th of a second.
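To make that crunching concrete, here’s a minimal sketch of multi-frame exposure fusion, the textbook technique behind this kind of merging. Apple hasn’t published how Smart HDR actually weights its frames, so the Gaussian mid-tone weighting and the function names below are illustrative assumptions, not Apple’s pipeline.

```swift
import Foundation

// Toy exposure fusion: merge several aligned frames of the same scene,
// letting well-exposed pixels outvote blown highlights and crushed
// shadows. Frames are modeled as flat arrays of luminance in 0...1.

/// Gaussian weight that peaks for mid-tone pixels (around 0.5).
func wellExposedness(_ value: Double, sigma: Double = 0.2) -> Double {
    exp(-pow(value - 0.5, 2) / (2 * sigma * sigma))
}

/// Fuse aligned frames into one image via per-pixel weighted averaging.
func fuseFrames(_ frames: [[Double]]) -> [Double] {
    guard let pixelCount = frames.first?.count else { return [] }
    return (0..<pixelCount).map { i in
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in frames {
            let w = wellExposedness(frame[i])  // never exactly zero
            weightedSum += w * frame[i]
            totalWeight += w
        }
        return weightedSum / totalWeight
    }
}

// Three bracketed "exposures" of the same three pixels:
let short  = [0.05, 0.45, 0.60]  // protects the bright highlight
let normal = [0.15, 0.70, 0.95]
let long   = [0.50, 0.95, 1.00]  // lifts the dark shadow
print(fuseFrames([short, normal, long]))
// Each fused pixel leans toward whichever frame exposed it best.
```

A real pipeline also has to align the frames, reject ghosting from anything that moved between exposures, and tone-map the result, which is presumably where those “trillions” of operations go.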
Multi-shot HDR isn’t new in and of itself (it has been the standard way to shoot with the iPhone since the iPhone X debuted), but now the system looks for discrete elements in the frame, like faces, which are always important, or nebulous edges that could indicate blur, and then tries to fix them.
The results are images that look bright and vibrant, but if you’re used to images from a traditional camera or a DSLR, they take a little getting used to. The shadows have more detail, but since they’re not as dark, they sometimes lack the impact of a nice dark section of the image.
A photo with a bright blue, cloudless sky typically means dark, hard shadows and abundant contrast, but the iPhone brightens up those dark areas to the best of its ability. The reward is lots of detail, but the cost is that sometimes things look more like a screenshot from World of Warcraft than a typical photograph.
Portrait mode
The upgraded photographic algorithms in the iPhone XS Max have also improved the Portrait Mode feature, which applies blur to the background of an image to mimic the look of a professional lens with a fast aperture.
We first met Portrait Mode back in the iPhone 7 Plus and it has absolutely come a long way. In fact, now the iPhone XS Max allows you to tweak the amount of blur that you add to the background of your photos after you shoot them. Just like you can brighten up a shot in post, you can now make the background sharper or blurrier depending on your tastes.
In many cases the effect works rather nicely. When I shot a model on a scenic pier in New York City at golden hour, the results were great. Putting a subject against an otherwise-distracting background is also a good use case for the feature. It offers some of the same benefits as a truly fast, wide-aperture lens that lets in lots of light.
But, again, there’s a disconnect. When you put something in front of your subject (a common framing technique among portrait photographers), Portrait Mode won’t grab onto it and blur it to match the background. I totally get why, but it’s another adjustment for me when framing a portrait.
I find it particularly interesting that Apple chose to label its adjustable blur modes with an f-number. While it’s typically true that a smaller f-number translates into a shallower depth of field (and more blur), it also typically has other effects on an image, like darkened corners and shorter exposure times. So, while it’s fun to say you can “change your focus” after you shoot, that’s not really the case. You can add or subtract blur, but you can’t create a sharp image if you missed focus, no matter how far you push the slider.
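To illustrate why that slider is really a blur dial rather than a focus control, here’s a hypothetical sketch. The depth map saved with the photo stays fixed; the simulated f-number only rescales how quickly blur grows away from the subject plane. The linear falloff and the constants are my assumptions for illustration, not Apple’s actual model.

```swift
import Foundation

// Hypothetical model of an adjustable depth-of-field slider: the photo
// keeps a per-pixel depth map, and the simulated "f-number" merely
// rescales how fast blur grows with distance from the subject plane.
// The linear falloff and scaling constant are illustrative assumptions.

/// Map each pixel's depth (in meters) to a blur radius (in pixels).
func blurRadiusMap(depths: [Double],
                   subjectDepth: Double,
                   simulatedFNumber: Double,
                   maxRadius: Double = 25.0) -> [Double] {
    // Loosely mirror real optics: blur scales inversely with f-number.
    let strength = 10.0 / simulatedFNumber
    return depths.map { depth in
        min(maxRadius, abs(depth - subjectDepth) * strength)
    }
}

// Subject at 1 m, a wall at 4 m, and distant trees at 10 m:
let depths = [1.0, 1.0, 4.0, 10.0]
print(blurRadiusMap(depths: depths, subjectDepth: 1.0, simulatedFNumber: 16))
// ≈ [0, 0, 1.9, 5.6]  -- background stays fairly crisp at f/16
print(blurRadiusMap(depths: depths, subjectDepth: 1.0, simulatedFNumber: 1.4))
// ≈ [0, 0, 21.4, 25]  -- background melts away at f/1.4
```

Notice that nothing here re-derives sharpness: a pixel captured out of focus keeps whatever blur it was born with, and the slider can only pile more on top.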
It’s worth noting that you get the same portrait blur effects from the 7-megapixel front-facing camera as well, so your selfies will look classier than they did before.
So, does the iPhone XS have a good camera or what?
The short answer is yes. The new iPhone camera is great. It focuses and shoots quickly, which is great for street photography. The extra exposures and the redesigned lenses seem to translate into extremely sharp photos. And with Smart HDR, you do get more detail in your images, even if that sometimes blunts the impact of an image that would otherwise benefit from some truly dark shadows.
But there’s also a learning curve. Portrait Mode is really fun, but if you use it a lot, you’ll find that focusing and shooting in that mode is decidedly slower than taking a typical shot. The system has to find the subject, then figure out how to apply depth to the scene before it snaps the photo. And now that we’re introducing all of this blur (simulated or otherwise) into an image, you’re likely to miss more frequently and end up with a blurry photo that can’t be blamed on camera shake.
And if you’re a camera purist, you probably have some work to do getting used to this new era of computational photography. Google has been doing the same kind of work with its Pixel Visual Core, and other smartphone makers are adding even more camera modules so they can crunch more photographic data every time you push the button.
Last note
Whether you love or hate the look of the iPhone camera’s images, you’ll be glad to know that smartphone camera flashes are still hilariously bad. The iPhone XS Max does an OK job with its LED-based lighting solution, but flash photography is still well and truly a discipline for photographers with dedicated cameras. Here are two pictures of a dumpster to remind you of that fact.