iPhone XS vs. XR: Are the camera differences worth $250?

iPhone XR has the same front-facing TrueDepth camera system as iPhone X, XS, and XS Max. That means it can do the same Portrait Selfies, including Portrait Lighting and the new Depth Control that lets you change the bokeh from f/1.4 to f/16, and the same Animoji, Memoji, and augmented reality effects as all of those higher-priced phones.


And, by the way, Apple says it's heard our complaints — no, not the wacky internet hashtag "smoothgate" conspiracy theories that were floating around — but the legit complaints I talked about previously:

Apple, for its part, really, truly, deeply believes the new imaging pipeline is better than the previous one and better than what anyone else is doing today. If you disagree — and when it comes to the selfie results, I personally disagree hard — or soft, or smooth, or whatever — it's important to let Apple know. A lot. Because pipelines can and will be tweaked, updated, and improved over time. And, like I said, if they can detect and preserve fabric, rope, cloud, and other textures, why not skin texture as well?

Apple identified a bug, not just with faces but with everything it was processing, and will be fixing it in iOS 12.1 by making sure it picks the sharpest frame possible to compute from, and we'll all be getting better, crisper selfies going forward.

What's different is the camera on the back. Where iPhone X, XS, and XS Max have dual camera systems with wide-angle and telephoto lenses, fused together for 2x optical zoom and rear-facing Portrait Mode, iPhone XR has just the wide angle.

Eyes Wide Open

iPhone XR has the same improved wide-angle camera as the XS and XS Max, complete with a 30% bigger sensor whose bigger, deeper pixels drink in more light and more accurately preserve colors. And it has the same new Smart HDR feature that ties the image signal processor to the 8-core neural engine, buffers up to 4 frames ahead, shoots a series of exposures, interleaves them with underexposed frames to pull detail from the highlights, and tops it off with a long exposure to pull similar detail from the shadows.
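Apple hasn't published exactly how Smart HDR fuses all those frames, but the core idea, weighting each frame's pixels by how well exposed they are so that underexposed frames win in the highlights and the long exposure wins in the shadows, can be sketched in a few lines. This is a toy illustration under my own assumptions, not Apple's pipeline:

```swift
import Foundation

// Toy exposure-fusion sketch, not Apple's actual Smart HDR pipeline.
// Each frame is an aligned luminance map normalized to a common exposure
// (0.0 = black, 1.0 = white).
struct Frame {
    let luminance: [Double]
}

// Weight a pixel by how well exposed it is: mid-tones carry the most usable
// detail, clipped highlights and crushed shadows the least. (Heuristic
// assumed for illustration.)
func exposureWeight(_ value: Double) -> Double {
    let sigma = 0.2
    let d = value - 0.5
    return exp(-(d * d) / (2 * sigma * sigma))
}

// Fuse per pixel: underexposed frames naturally dominate in the highlights
// (they aren't clipped there), the long exposure dominates in the shadows.
func fuse(_ frames: [Frame]) -> [Double] {
    guard let first = frames.first else { return [] }
    var result = [Double](repeating: 0, count: first.luminance.count)
    for i in result.indices {
        var weighted = 0.0, total = 0.0
        for frame in frames {
            let v = frame.luminance[i]
            let w = exposureWeight(v)
            weighted += w * v
            total += w
        }
        result[i] = total > 0 ? weighted / total : first.luminance[i]
    }
    return result
}
```

In the real pipeline the frames also have to be aligned, denoised, and tone mapped, which is where the ISP and Neural Engine earn their keep.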

iPhone XR in red (Image credit: iMore)

You can turn Smart HDR off in Settings, or keep both the Smart HDR and normal versions if you like, but it all adds up to a very similar experience from a camera phone that costs only three-quarters as much.

Where iPhone XR really differs from XS and XS Max is in rear Portrait Mode. Absent a second, telephoto camera to leverage for more, real depth data, Apple is doing with iPhone XR something similar to what Google did with Pixel 2 last year — using the parallax pulled off the phase-detect autofocus system, or what Apple calls Focus Pixels, to get some depth data — and then applying a machine-learning-powered segmentation mask to separate the subject from the background.
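If you want a feel for what that last step looks like, here's a deliberately simplified sketch: given a coarse depth map from the Focus Pixels and a person-segmentation mask from the neural network, composite a sharp foreground over a pre-blurred background. Every structure and threshold here is my assumption for illustration, not Apple's implementation:

```swift
// Deliberately simplified single-camera portrait composite, not Apple's
// implementation: sharp subject (per the segmentation mask) over a
// pre-blurred background, with coarse depth softening the edges.
struct PortraitInputs {
    let sharp: [Double]      // original image, one luminance value per pixel
    let blurred: [Double]    // same image, pre-blurred for the background
    let depth: [Double]      // coarse depth from Focus Pixels, 0 near ... 1 far
    let personMask: [Double] // ML segmentation confidence, 1.0 = person
}

func composite(_ input: PortraitInputs, focusDepth: Double = 0.2) -> [Double] {
    var out = [Double](repeating: 0, count: input.sharp.count)
    for i in out.indices {
        // How close this pixel is to the subject's depth plane (0...1).
        let depthTerm = max(0, 1 - abs(input.depth[i] - focusDepth) * 4)
        // Trust the segmentation mask first; use depth as a soft tiebreaker.
        let keepSharp = max(input.personMask[i], depthTerm * 0.5)
        out[i] = keepSharp * input.sharp[i] + (1 - keepSharp) * input.blurred[i]
    }
    return out
}
```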

Portrait Mode

Now, as Apple was announcing this on stage, I was worried. I know a lot of people love, love, love Google's Portrait Mode, but as someone who's owned a Pixel 2 XL for a year now, I've had some issues with it.

Some were minor, like the slightly cooler color cast it looks like Google fixed with Pixel 3. Others were bigger, like the cardboard-cutout effect segmentation masks can produce and, frankly, I see a little of that on the XR as well. One was a deal-breaker, and kept me using the Pixel for regular photography but not for Portrait Mode: its inability to show the effect live, in the preview.

I explained why in my review, but I'll quickly repeat it:

The Pixel's Portrait Mode isn't really Portrait Mode because it doesn't show the actual effect live in the preview; it only applies it after a few long seconds, as a post-production filter. Something you could do with any app. Heck, something Google could release as an app for other camera phones.

https://www.instagram.com/p/BbzYB5YFyvM/?taken-by=reneritchie

Many would argue that none of this matters, just the end result. I say bullshit. At least for me, because I'm used to shooting with a DSLR, I'm used to framing for the actual shot I'm getting. If I don't like the depth of field in the preview, I can move a little and get it just how I want it before taking the shot. With Pixel, I had to take the shot, go check it, wait for it to apply, and then, if I didn't like it, shoot it all over again.

And all of that was a nasty surprise when I got the phone because — everything is terrible — almost no one covered it in their reviews.

What I think it comes down to is that Google seems to be using separate pipelines for the Pixel's live preview and actual camera capture, whereas Apple has gone to great engineering and silicon pains to make sure that, like a DSLR, what you see is what you shoot. Even more so, because the iPhone display is so much better and more accurate than any DSLR's.

It doesn't sound like Pixel 3 solves this, so, while I'm getting one, I'll likely be sticking to non-Portrait Mode photos with it. Yell at me in the comments all you want, Google nerds — I heart you anyway, I just skew far more towards optical nerdery.

So, long tangent short, I was worried Apple would end up doing the same on iPhone XR. But, turns out, not so much. Whether it's the power of the A12 Bionic or just the result of different design trade-offs, Apple has managed to push the depth effect into the live view on iPhone XR so, blessedly, what you see is what you shoot.

Follow the money... er... demo

Apple showed off the results on stage and in its demo pics. I know you can't always trust demo pics. They tend to be the cherry-picked, idealized, best-of-the-best examples.

Apple has a good reputation here though. They don't cheat and claim DSLR photos are shot with their phone, or bring special lighting gear with them that an average customer wouldn't have access to. And they also don't hire professional photographers to go on tour, or make huge publicity buys with massive magazine media companies, including cover shots.

A lot of famous photographers use iPhones, and plenty of magazines have shot covers and features on iPhones, but as far as I can tell, Apple hasn't ever paid for carry or placements.

And the demo shots are usually quickly backed up by "Shot on iPhone" shots, which is something Apple latched onto early: people started hashtagging their photos on Flickr and Instagram, and Apple noticed, became enamored, and quickly got behind it and started amplifying it. Which was smart: the best campaigns are often the ones your customers come up with.

Second long tangent, short: I thought I had an idea what the XR could do with its new Portrait Mode. But, no. Shooting with it for the last week has been one surprise after another. Some good. Some not so good. All of it educational.

The major difference is this:

With all previous Portrait Modes, from iPhone 7 Plus to iPhone XS and XS Max, you were shooting with the effectively 52mm telephoto lens. With iPhone XR, you're shooting with the effectively 26mm wide-angle lens. Switching from one to the other is like swapping glass on a traditional camera.
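To put numbers on that: horizontal field of view follows directly from the 35mm-equivalent focal length (standard optics, nothing Apple-specific), and the difference is dramatic:

```swift
import Foundation

// Horizontal field of view for a 35mm-equivalent focal length
// (full-frame sensor width is 36mm). Standard optics, nothing Apple-specific.
func horizontalFOV(equivalentFocalLength: Double) -> Double {
    let sensorWidth = 36.0
    return 2 * atan(sensorWidth / (2 * equivalentFocalLength)) * 180 / .pi
}

print(horizontalFOV(equivalentFocalLength: 26))  // ≈ 69.4° — iPhone XR wide angle
print(horizontalFOV(equivalentFocalLength: 52))  // ≈ 38.2° — XS telephoto
```

Roughly 69 degrees versus 38 degrees: the XR sees almost twice as wide, which is exactly why you have to get closer to fill the frame.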

That's especially true because, instead of just slapping a custom Gaussian or disc blur over the background and calling it a day — which is what Apple used to do and, I think, pretty much every other camera phone maker still does — this year Apple examined a bunch of high-end cameras and lenses and created a virtual lens model for both iPhone XS and iPhone XR.

That means it ingests the scene with computer vision, makes sense of everything it sees, and then renders the bokeh, including lights, overlapping lights, and the kinds of distortions real glass physics produces in the real world.

And, when you slide the new Depth Control back and forth between f/1.4 and f/16, it re-calculates and re-renders the virtual lens model.
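You can get a feel for why the slider changes so much with a back-of-the-envelope version: in a thin-lens model, the blur disc (circle of confusion) scales with the physical aperture diameter, which is focal length divided by f-number. The constants here are ones I picked for illustration; Apple's virtual lens model is far more elaborate:

```swift
import Foundation

// Back-of-the-envelope Depth Control: blur-disc radius for a background pixel,
// from the chosen f-number and distances to the focus plane and to that pixel.
// Thin-lens approximation; all constants are illustrative, not Apple's model.
func blurRadiusInPixels(fNumber: Double,
                        subjectDistance: Double,  // meters to focus plane
                        pixelDistance: Double,    // meters to this pixel
                        focalLength: Double = 0.026,      // 26mm equivalent, in meters
                        pixelsPerMeter: Double = 50_000,  // assumed output scale
                        maxRadius: Double = 25) -> Double {
    // Physical aperture diameter: focal length / f-number.
    let aperture = focalLength / fNumber
    // Standard thin-lens circle-of-confusion diameter, in meters.
    let coc = aperture * focalLength
        * abs(pixelDistance - subjectDistance)
        / (pixelDistance * (subjectDistance - focalLength))
    return min(coc * pixelsPerMeter, maxRadius)
}

// Lower f-number (wider virtual aperture) means bigger blur discs:
print(blurRadiusInPixels(fNumber: 1.4, subjectDistance: 1.5, pixelDistance: 4))  // ≈ 10
print(blurRadiusInPixels(fNumber: 16,  subjectDistance: 1.5, pixelDistance: 4))  // ≈ 0.9
```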

The result is the same kind of character and, yeah, personality you get with real-world lenses. And that means shooting with iPhone XS vs. iPhone XR gives you photos with different character and, yeah, personality.

There are also some huge pros and cons to get used to.

The good, the bad, and the bug'ly

Apple's wide lens is, of course, wide. So, if you want a face to fill the frame, you'll have to sneaker-zoom in instead of out. That you can move in and out so much is terrific, though. You're not bound by the same sweet spot that you are with the dual-camera Portrait Mode system, which often seems to be telling you to move closer or move further away.

And that means you can get a lot closer or a lot further away with the XR than you can with the XS, and still trigger the depth effect.

And because the XR is using the f/1.8 aperture wide angle for Portrait Mode, and not the f/2.4 telephoto like the dual-camera iPhones, it can pull in more light and compute its version of the depth effect in much darker conditions than iPhone XS or any previous iPhone.
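Back-of-the-envelope: light gathered scales with aperture area, which goes as 1 over the f-number squared, so (2.4 ÷ 1.8)² ≈ 1.8x, a bit more than three-quarters of a stop more light for the wide angle.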

But only for human faces. Which is where XR might hit its own deal-breaker, at least for some people.

And, unlike iPhone 7 Plus when it first shipped, where Portrait Mode was optimized for human faces but would do its best with everything else — and now, with iOS 12, does amazingly well on an incredibly diverse set of subjects and objects — iPhone XR literally won't engage Portrait Mode if it can't detect a human face.

Now, like I said in my review, it's pretty great at engaging when it does. It uses a Face ID-like neural network to not only identify human faces but identify them even when they're partially obscured by glasses, hats, scarves, and the like. Apple trained and tested it on an incredibly diverse and varied pool of people and the things people usually wear on their heads and faces.

But that does mean no coffee or cocktail cups in deep blur, no pets in depth effect, and it can even lose track of human faces if they turn too far past profile.
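Apple hasn't published how the XR's gate works internally, but an app could approximate the behavior with the public Vision framework: run face detection and only enable the effect when a face comes back. A minimal sketch, where the gating logic is my assumption (the XR's internal network is presumably far more capable with partial occlusion and profiles):

```swift
import Foundation
import CoreGraphics
import Vision

// Minimal sketch of a "humans only" gate using the public Vision framework:
// run face detection and enable the portrait effect only if a face is found.
// The gating logic is an assumption for illustration, not Apple's pipeline.
func canEngagePortraitMode(on image: CGImage,
                           completion: @escaping (Bool) -> Void) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        completion(error == nil && !faces.isEmpty)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do { try handler.perform([request]) }
        catch { completion(false) }
    }
}
```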

Like I also said in my review, the f/1.8 camera has gotten good enough that you can get a lot of real depth of field by picking your shots. But, if you want the computational stuff and you want it for everything, you'll have to move up to an iPhone XS or XS Max, or even an iPhone 8 Plus or iPhone 7 Plus, to get it.

The computational era

I'm going to say this again, because it bears repeating: as good as DSLRs and Micro Four Thirds cameras have become — and I shoot the sit-down scenes and some of the B-roll for this very show on Panasonic and Canon — we now live in an age of computational photography. Of bits that can go far beyond the atoms.

Theoretically, those bits — those computational cameras — have no limits. They can reproduce the world in a way no bound-by-physics glass ever could. It could end up looking more real than real. Scientific and sterile, or just uncanny and unnatural.

By imposing some of the constraints of real-world physics and lenses on computational models, not only does the "wrong" we've gotten used to look right, but the limits add character and drive creativity.

And, physical or computational, that's what you want from a great camera.

So, is the difference in cameras worth $250+?

I've been shooting with Apple's dual camera system for a few years now and the new single camera system for just around a week. So, obviously, I want to shoot a lot more to get a better handle on it.

But, I think it's abundantly clear already that, if you absolutely don't care about the telephoto lens — including 2x optical zoom, real depth data for more diverse rear Portrait Mode photos, and the framing you get from all of that — then you can indeed save yourself $250+ by going with iPhone XR instead.

Personally, I do want the dual-camera system, so I'll be sticking with the XS, but let me know which way you're leaning — or if you've already made your choice.
