iPhone photography in the RAW

Rene: Joining me today, we have the man who literally put the stitched leather into iOS, former Apple designer, former doubleTwist designer, long-time iconographer, and right now working on an app called Halide. How are you, Sebastiaan?

Sebastiaan: I'm great, Rene. How are you doing?

Rene: Very well, thank you. You did a series of posts about shooting RAW on iPhone that were just breathtaking, eye-popping. I don't know all the superlatives I'm supposed to use for it, but it was really impressive.

I thought it'd be great to sit down and talk to you, because people hear RAW, and they get really excited, but then they might think about trying it and get a little bit intimidated. I wanted to demystify it and figure out where people can start making much better photos.

Sebastiaan: Totally. Of course, the iPhone already has an amazing camera. A lot of people actually start with the thought, "Why would I even want to change from the stock camera app?" The reality of it is RAW is a way to get a lot more freedom when you're editing your photo.

If you're already doing stuff like editing your photos in Instagram, or maybe using apps like Lightroom, or Darkroom, or VSCO, using RAW is a really good way to give yourself a bit more freedom to get more of your shot.

Getting RAW with your iPhone photography

Rene: What is RAW? If someone has only heard the term, but they don't know what it is, what is a RAW file?

Sebastiaan: When a camera takes a photo, it does a lot of things. It takes in all this light, and then there's a tiny, little square behind the camera lens on your phone that turns that light into a signal. Usually quickly, your phone takes all that data, does a bunch of magic to it, and creates a processed image.

It applies some magic to make it look as good as possible, and creates a final image. A RAW file is basically taking all that information that it captured, and not doing any of the processing, so you just get exactly what the camera saw right out of the camera.

It might not look as good, but you'll get a little bit more information, and given some massaging you might be able to get something special out of it.
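
(For developers curious what "doing the processing yourself" looks like in practice, here is a minimal Swift sketch using Core Image's RAW developer. It is an illustration only: the exposure and noise-reduction values are arbitrary, and `rawURL` is a hypothetical path to a DNG file.)

```swift
import CoreImage
import UIKit

// A RAW (DNG) file is just the sensor data; the app has to "develop" it.
// CIRAWFilter is Core Image's modern RAW developer (iOS 14.3+); the same idea
// was exposed in the iOS 10/11 era through CIFilter's RAW options.
func developRAW(at rawURL: URL) -> UIImage? {
    guard let raw = CIRAWFilter(imageURL: rawURL) else { return nil }

    // The "massaging": our own choices instead of the camera's secret sauce.
    raw.exposure = 0.5                        // push exposure by +0.5 EV
    raw.luminanceNoiseReductionAmount = 0.4   // modest noise reduction

    guard let developed = raw.outputImage,
          let cgImage = CIContext().createCGImage(developed, from: developed.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```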

Rene: You enable that in your app, in Halide. It's enabled in a few different iOS apps. What is shooting RAW like on an iPhone? Is it shooting RAW like you think about on a DSLR or on a traditional camera?

Sebastiaan: It's mostly the same, but there's a few key differences. What's really different on your iPhone is that you will immediately notice that you lose a few things, which I refer to as Apple secret sauce. A camera does a couple of really special things. They've got super smart people at Apple working on this stuff.

Things like noise reduction. One little thing which most people don't know is when you take a photo on your iPhone, it will take like 20 of them, find the sharpest one, automatically select it, so you see the sharpest possible image.

A RAW photo won't do that. You'll just get one shot. It'll probably look a little noisier than it would in the normal camera app.

Rene: To dive deeper, Apple has been doing the A series chips for a while. Part of the A series chips is an image signal processor, an ISP. That does all these things like auto light balance, autofocus, face detection.

It removes noise, but then it detects textures like fabrics, and tries to make the compression not interfere with the patterns. It detects clouds, and snow, and it does an incredible amount of work. Like you said, it takes a bunch of different photos, and takes different pieces and tries to make that photo HDR, and all these different things, to give you...

The idea is that you can take the phone out of your pocket, shoot a photo, and it will be the best possible photo given those circumstances. But there are some people, and you're certainly an example, I'm an example when I shoot on a DSLR, who want maximum control, not maximum convenience.

Sebastiaan: Certainly. RAW on iPhone, we have to say, is a fairly new thing. As developers, we only got this maybe a little over a year ago: full access to do stuff like this, get all the data from the sensor, and have manual settings and such. We'll get to that in a bit.

Maybe in the future, it'll be more like on a pro-level camera, a normal camera, where the RAW file has a bunch of magic applied to it. On SLRs, as you've probably noticed, noise reduction is applied to the RAW file, so you get a pretty noise-free RAW file, pretty comparable to the normal JPEG file you get out of the camera.

On the iPhone, it's not the case yet. I suspect part of it is because it's just so new. It's a fairly new feature to have on the iPhone.
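
(A note for developers following along: the access Sebastiaan describes arrived with AVCapturePhotoOutput in iOS 10, with the AVCapturePhoto-based delegate coming in iOS 11. Here is a minimal sketch of a RAW request, not Halide's actual code; the class name is ours, and it assumes a capture session is already configured and running.)

```swift
import AVFoundation

/// Minimal sketch (not the Halide source) of asking AVCapturePhotoOutput for RAW.
/// Assumes `photoOutput` is already attached to a configured, running AVCaptureSession.
final class RAWCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func captureRAW() {
        // Pick one of the Bayer RAW formats the current camera offers.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
            print("RAW not supported on this camera")
            return
        }
        // Ask for RAW plus a processed JPEG companion, so there's a quick preview.
        let settings = AVCapturePhotoSettings(
            rawPixelFormatType: rawFormat,
            processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Write the RAW sample out as a DNG file when it arrives.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, photo.isRawPhoto,
              let dng = photo.fileDataRepresentation() else { return }
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.dng")
        try? dng.write(to: url)
    }
}
```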

Rene: People with Nokia phones and some Android phones are usually laughing, like, "Ha, ha, ha. You only invented this in 2017 or 2016, whenever it was." Apple deliberately tried to keep the built-in camera app super simple so that people could just use it out of the box and not worry about it.

With Halide, you made it feel tactile. The only example I can give is like Leica. You made it feel like there are dials that I can twist and turn.

Sebastiaan: Thanks. That's a great compliment. We do try to be as close to a real camera as possible.

Designing Halide for RAW and Manual

Rene: When you were working on Halide, what were your design goals in terms of RAW and manual? How did you want to expose settings but not introduce a lot of complexity?

Sebastiaan: You just described the mission we had when we set off to build it. My friend Ben and I built it together; it's just the two of us. We saw some professional camera apps for the iPhone out there, but they had a ton of bells and whistles, a really overwhelming number of controls.

Then there's the iPhone camera, which we can never beat. The stock camera app is incredibly good. Some of the greatest talent at Apple is working on it, but there's already so much in it, like time-lapse mode, and video recording, all that kind of stuff. Obviously, they probably won't add much more complexity to it.

We wanted to strike the middle ground in that. The way we did that is by looking at existing cameras and mimicking the UI you see on them. If you hold a camera, if you give a child a camera, the first thing they'll do is click the buttons, and twist the wheels and stuff.

There's something very intuitive about how tactile it is. It's very satisfying. To replicate that on the iPhone, we decided we needed canonical gestures. The gesture of choice in the stock camera app is tapping. It's a one-dimensional input method.

You say, "This is a point of interest." You try to figure out from your tap, should it be in focus, should it be exposed, both? The very common frustration point on that is now your flower is in focus, but the whole background is too bright.

HDR tries to take that away, but it can't do it perfectly. Maybe you just want to tell your camera, "I just want the photo to be in focus. The exposure is fine. Don't touch that." That's one-dimensional input. We went two dimensions.

You can swipe up and down on the viewfinder anywhere on the screen, whether you're a lefty or a righty. It gets the exposure up and down, to make it brighter or darker, which happens to mirror the way a camera works. You usually have an exposure wheel that you also twist that way.

For focus, you can swipe left and right. Normally autofocus is enabled, but if you swipe, you don't have to tap anything; it goes right into manual focusing. If you swipe left, you'll focus on something a little closer to you.

If you swipe right, you focus farther away, which is the same way a lens adjusts. If you're holding a camera, left is closer, right is farther.

That was the initial setup. Then within that, we added a few toggleable features, like turning RAW on or off, or controlling white balance. That's all behind an extra tap, but with basically the same style of adjustment.
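
(To make the two-axis gesture idea concrete, here is a rough sketch of how those pans could map onto AVFoundation's manual controls. This is an illustration under our own assumptions, not Halide's implementation; the class name and scaling constants are arbitrary.)

```swift
import AVFoundation
import UIKit

/// Rough sketch of the two-axis idea: vertical pan drives exposure compensation,
/// horizontal pan drives manual focus. `device` is the active AVCaptureDevice.
final class ViewfinderGestureController: NSObject {
    let device: AVCaptureDevice
    private var lensPosition: Float = 0.5   // 0.0 = nearest focus, 1.0 = farthest
    private var exposureBias: Float = 0.0   // in EV

    init(device: AVCaptureDevice) {
        self.device = device
        super.init()
    }

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        let translation = pan.translation(in: pan.view)
        pan.setTranslation(.zero, in: pan.view)

        do {
            try device.lockForConfiguration()
            defer { device.unlockForConfiguration() }

            if abs(translation.y) > abs(translation.x) {
                // Swipe up/down anywhere on the viewfinder: exposure compensation,
                // mirroring a camera's exposure wheel. Up = brighter.
                exposureBias = max(device.minExposureTargetBias,
                                   min(device.maxExposureTargetBias,
                                       exposureBias - Float(translation.y) / 100))
                device.setExposureTargetBias(exposureBias, completionHandler: nil)
            } else {
                // Swipe left/right: manual focus. Left = closer, right = farther,
                // the same direction a lens ring adjusts.
                lensPosition = max(0, min(1, lensPosition + Float(translation.x) / 300))
                device.setFocusModeLocked(lensPosition: lensPosition, completionHandler: nil)
            }
        } catch {
            print("Could not lock the camera for configuration: \(error)")
        }
    }
}
```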

Rene: It's interesting because you went from the photo-realism of the original iPhone app which had the shutter animation. It did have a bunch of real chunky, photo-realistic elements. Then we went to the Jony era of iOS 7, which became much flatter, and much starker, and rendered flat for digital screens.

You have the playfulness in the interactivity. Not necessarily in the rich textures, but in the way that you interact with the app.

Sebastiaan: Totally. Like you mentioned, the old camera app was literally metal. It was actually modeled after a real camera. I always loved that the old camera icon was also a camera lens, and it was sitting right over where the camera was behind the screen. That was a cool touch.

We are in the era right now where a lot of the playfulness in apps comes out of interactivity. Apart from the fact that we made the shutter button wink, it's a little tactile. It's got a bit of a gradient on it. I couldn't resist it. I'm still me.

Everything else is in the gestures, and how things interact. We went to the next level with the iPhone X, where we were like maybe we can make it really pleasant to use with one hand, because it's such a tall device to hold. Maybe we can make sure that everything is under your thumb, right here.

Rene: I have it up on a little tripod here. It's narrow, so you have the buttons, but you use the horns too. You went all in on the horns: you have the histogram on one side, and on the other, you have the speed, right?

Sebastiaan: Yeah. You can see the exposure compensation, or your shutter speed there.

Rene: I love that. A lot of people complain about the horns, and they thought that the camera was taking away screen. Most phones just go completely horizontal. Apple exposed this Apple Watch-style complication area. Some apps, especially yours, are doing a good job utilizing those.

Sebastiaan: It's cool to see more and more apps taking advantage of that. There was a bit of uncertainty at first too, because I saw a lot of people comment, even in the iMore comment section. They were like, "Hey, is that allowed by Apple?" We were like, "We got featured in the App Store, so I think they're OK with it."

There was a bit of confusion, like what can you do with it? Is that kosher? Apple has been really good about encouraging people to be a little innovative with it.

The iPhone RAW workflow

Rene: What does your RAW workflow look like? When you are out, you want to shoot something, and you know that you want to make a real Instagram-worthy or studio piece, what is your workflow like?

Sebastiaan: Usually, because we can't override the really nice gesture on the phone where you swipe left into the camera, I use the little on-screen widget -- probably not visible right now -- you can add it to your Today view and get into the app with one tap.

I use Halide, and then I usually have it already set up to automatically capture RAW. I usually push the exposure down a little bit, because it's better to make sure that you have every bright area well exposed. You can always bring the shadows up a little bit later, and I snap the shot.

Then depending on how close I am to a computer, I import it into Darkroom, which is this wonderful app. It's actually free if you don't want to buy all the pro features, and it lets you set up presets.

I have a couple of presets set up that I apply to see which one looks best, then I put it on Instagram. If I'm near my Mac, I just AirDrop it over, pull it into Lightroom, and edit from there.

Rene: Is it based on feeling, or are there certain things...You know you're always going to crush the blacks, or you're always going to boost the saturation. Is there a set thing you do, or do you play it by ear?

Sebastiaan: I wrote a little bit about this in my second article about editing RAW, but for me, for a lot of people, it's a creative choice. A good thing to keep in mind is trying to edit it in a way that makes it more representative of what you feel like you saw when you were there.

Phones -- all cameras, actually -- are legendarily bad at sunsets. The reason they're so bad at them is because the light is colored. I could go into how white balance works and color spill, but the gist of it is that once light starts being a particular color, your camera just doesn't understand what's white anymore, because white surfaces will reflect the colored light.

That's why your sunset photo never looks like what you see, because your eyes are really good at figuring out white balance. I know in my mind what I saw. Poets have written beautiful things about this. I'm not a poet. I'm not going to do it.

You probably thought, "Oh, it's beautiful." It's like a peach: a beautiful violet, going to orange, going to yellow. Try to bring that out. Take your photo, especially a RAW photo, and see if you can bring out those colors.

You can boost the saturation a little bit. Most apps nowadays have selective color adjustment. You can say, "I think I saw the purples being a little stronger," bring the reds up a little bit, make it a little warmer than the camera saw it.

Before you know it, you'll actually have a photo that looks really good, simply because you already knew how it looked good, because your eyes saw it.
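
(As an illustration of that kind of sunset rescue, here is a small Swift sketch using Core Image's RAW developer and a vibrance filter. The values are a matter of taste, `rawURL` is a hypothetical DNG path, and dedicated editors like Darkroom or Lightroom expose the same ideas as sliders.)

```swift
import CoreImage

// Sketch of the sunset rescue on a RAW file: fix white balance at develop time,
// then gently boost the muted colors. Values are taste, not rules.
func developWarmSunset(at rawURL: URL) -> CIImage? {
    guard let raw = CIRAWFilter(imageURL: rawURL) else { return nil }

    // Tell the developer the light was warmer than the camera guessed:
    // a higher neutral temperature renders the image warmer.
    raw.neutralTemperature = 7200
    raw.neutralTint = 4            // a touch toward magenta

    guard let developed = raw.outputImage else { return nil }

    // Vibrance boosts muted colors more than already-saturated ones,
    // which tends to look less "Best Buy TV" than plain saturation.
    return developed.applyingFilter("CIVibrance", parameters: ["inputAmount": 0.4])
}
```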

Samsung vs. Apple photo philosophies

Rene: One of the things that I find super interesting is that different companies have different philosophies. Apple is famous for wanting to make all the decisions for you, but photographs are one of the things where they don't.

They try to reproduce as naturally as possible what comes off the sensor, whereas Samsung, which gives you a lot of manual controls, like duplicate versions of every app, goes for making it look good for you. They do crush the blacks, and they do boost the saturation.

I find it super interesting that's the one area where Apple wants to not get up in your business.

Sebastiaan: It's interesting that that's their philosophy with displays and cameras. You also see this with the Samsung OLED screens. They're really punchy, typically on the oversaturated side. People see that as their strength: they're really colorful screens.

Apple went all in on OLED with the iPhone X. They probably went through crazy, painstaking amounts of work to make sure it's color accurate, which may not be something the average person even appreciates, but if you're serious about photography or that kind of stuff, it's really, really nice.

Rene: It gets to this insecurity that I have. It's my photography impostor syndrome: when I do crush the blacks or boost the saturation a little bit, I'm worried that I'm trying to make it look like a Best Buy television set. I'm just classless, and I want to make the colors brighter.

Sebastiaan: It really kills me too when you do these adjustments, and then there's one person who is like, "That's a little too much," and you go, "Oh, my God. I made it look cheap and bad."

Rene: It's a watermark photograph now.

Sebastiaan: Yeah, exactly. [laughs]

Getting "Depthy" with it

Rene: You also support depth. There's a depth API in iOS 11, and you support depth, but you have a really interesting way of handling it, almost like a hinting feature.

Sebastiaan: Yeah. We started referring to this as depth peaking, because there is such a thing as focus peaking, where your camera can point out what's in focus. We are pretty close to releasing a new version of Halide which is entirely focused on depth capture, which is pretty crazy. There are a couple of really crazy features in it.

Depth is a really new thing. There are very few cameras out there that have two cameras working together to capture 3D data. Your iPhone, out of the box, does it with portrait mode. It can see the difference between those two images, work out the depth, and that's how it applies the blur behind people.

There are more applications than that. In fact, we had a lot of people from the visual effects industry contact us saying, "Hey, how can we get the depth data out of a photo so that we can play with it and make something crazy?" We said we didn't have a way right now. It's really cool to hear that people want to do something like that.

The last month or two, we've been working on really trying to make it the best camera for depth as well.
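
(For the developers following along: the depth capture Sebastiaan mentions goes through the iOS 11 depth APIs. Here is a minimal sketch, not Halide's implementation, assuming a dual-camera or TrueDepth device and an already-configured session; the class name is ours.)

```swift
import AVFoundation

/// Sketch of iOS 11 depth capture: ask AVCapturePhotoOutput to deliver an
/// AVDepthData map alongside the photo.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func enableDepth() {
        // Must be set after the output is added to the session, and only
        // if the current camera actually supports depth.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depth = photo.depthData else { return }
        // Disparity/depth arrives as a low-resolution CVPixelBuffer you can
        // inspect, filter, or hand off to effects.
        let map = depth.depthDataMap
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```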

iPhone X vs. Pixel 2 in Portrait Mode

Rene: This is where I get every Android or Google aficionado angry with me, but there was that line at the Google hardware event when they announced the Pixel 2, where they said that we don't need two cameras to get depth effect.

Google is super clever. They deserve a lot of credit. They take the focus pixels on their phase-detect autofocus system and use those to get a couple of layers of parallax, and they're really, really good at segmentation masking.

They can figure out algorithmically what is a head, pull that out, and then apply a blur to everything else, but there's not a lot of depth data. There's zero, I believe, on the front camera, where Apple, just on the iPhone X...

On the iPhone 7, they were getting...I think it was nine layers of depth data. I don't even know what they're getting on the iPhone X, but they're getting multiple layers of depth data. The front camera, too, is getting even better, more granular depth data, I think.

That means that for applications like yours, if you don't have access to the Google algorithm, you still have all that raw data. Like you said, special effects people might not care about doing a depth blur at all. They might have totally different things in mind for it.
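
(Picking up the earlier point about effects people wanting the depth data out of a photo: iOS does store it in saved portrait shots as auxiliary image data, and it can be read back with public API. A rough sketch of one way to do that, with a hypothetical `photoURL`:)

```swift
import AVFoundation
import ImageIO

// Sketch: pull the depth (disparity) map back out of a saved portrait photo,
// which is what an effects pipeline would start from. `photoURL` is hypothetical.
func depthData(fromPhotoAt photoURL: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any] else {
        return nil   // no depth stored in this file
    }
    // Rehydrate the dictionary into a typed AVDepthData object.
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```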

Sebastiaan: Totally. The other key benefit is -- and this is what really astonishes people with new Pixels when they move to one from an iPhone -- that on the Pixel you can't do any real-time stuff.

The iPhone camera can show you how a portrait photo is going to look, but with the Pixel, you just have to wait until it's done processing, and then it'll spit out an image, and yeah, that's a pretty decent background blur.

Rene: People get angry at me for this too, but actually, there's a video showing it. I find this disturbing, because I'm used to shooting with traditional cameras where, if I have a fast 50 on there, I see what the depth effect looks like. I see what's in focus and what's not in focus. That informs my decision of how I want to take the photo.

On the iPhone, because no blur effect is perfect, if I see aberrations around someone's glasses, or their nose, or something, I can turn a few degrees so I can get the best photo I can, where on my Pixel 2 XL, I take the photo, I look, "Ah, it's wrong." I go back, take the photo again, "Ah, still no."

I love that I can pull it into Photoshop and fuss with it, but I don't want to have to do that.

Sebastiaan: That's really frustrating. The real-time preview deserves a lot of credit. There's an incredible amount of power required to do that, and buckets of smarts from Apple. It's really, really impressive that they pull it off.

Rene: The old joke for me was that Nokia put really big glass on their cameras. Apple had really amazing image signal processors, and Google had no idea what the hardware was that you were shooting with. They would just suck everything up into the cloud and auto-awesome everything.

It's interesting that Apple stuck with solid hardware and increased their processing, so now they can do things. They have all that hardware, and it's just a matter of what the software can do, where I think the Pixel 2 is bound by what the software can do. You can't just add a better camera to it.

Sebastiaan: It's true. I think all the companies are running into the physical limits of what we can do with the small size of the camera. I don't think we'll see the huge leaps and bounds in camera quality that we used to see.

We'll probably see much better photos coming out of our phones, but increasingly, that will all be about the software. Apple has been really investing in that, which is really smart.

Beyond Portrait Mode

Rene: Portrait mode is used for the depth effect, but also, there are some filters that do a very nice job of adjusting how they're applied based on depth. There's also portrait lighting. It's still in beta, but it does different sorts of lighting, and even does the stage lighting effect.

I was joking that next, we would see portrait green screen, or portrait backgrounds. They shipped that in the Clips app. It's interesting that they're knocking down things that you would traditionally need a big camera, big glass, studio lights, or an actual physical green screen for, and doing all this stuff with just atoms and bits.

Sebastiaan: Of course, now they're using different technologies as well to get depth data. The iPhone X's front-facing camera is the TrueDepth camera. That has an astonishingly good idea of depth.

It's a little more shallow, so the range of depth that it captures is smaller, but your face will be extremely detailed in depth, whereas the dual camera is more about a couple of feet away from you to however much farther than that.

It's a Trojan horse. This kind of technology will enable AR, for instance, to be far, far better, because it can sense the geometry of your room so much more easily than, say, a Pixel.
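
(Both depth sources he describes are reached the same way in code. Here is a small sketch of discovering a depth-capable camera on either side of the phone; it's just for illustration and not tied to any particular app.)

```swift
import AVFoundation

// Sketch: find a depth-capable camera. The TrueDepth camera sits on the front,
// the dual (two-lens) camera on the back; both vend AVDepthData the same way.
func depthCapableCamera(position: AVCaptureDevice.Position) -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTrueDepthCamera, .builtInDualCamera],
        mediaType: .video,
        position: position)
    return discovery.devices.first
}

// Usage: front gives the detailed, short-range TrueDepth map;
// back gives the dual-camera disparity that powers Portrait mode.
let frontDepthCamera = depthCapableCamera(position: .front)
let backDepthCamera = depthCapableCamera(position: .back)
```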

Rene: The AR part is also a Trojan horse, because people look and go, "Ah, who cares that there's a castle floating on my sofa?" AR is also about ingestion. It's about understanding what's a flat surface, what's a wall. It's all this information.

Machines were dumb. They were dumb cameras. Oh, look, there's a photo. Now they can understand what actual physical objects are in those photos, and doing things like optimizing for surfaces, and for room geometry, and for light balance. That's going to help us go to yet another level.

Sebastiaan: Yeah, it's true. It's super impressive stuff. What is a phone now if not a collection of sensors?

iPhone vs. SLR

Rene: So true. What do you find yourself shooting with iPhone, and shooting with your Leica, for example? Leica, pronouncing everything wrong.

Sebastiaan: There's still the big physical limit when it comes to shooting in the dark. Phones just don't handle that well. You've got a very small lens, a very small sensor. Not perfect yet.

I've barely taken out my real cameras lately, simply because, A, I want to really make sure that I use my product a lot, and B, because increasingly the phone is just that good. I don't really need them on a sunny day anymore.

I went sailing yesterday on the San Francisco Bay, which was horrifying because we had 35-miles-an-hour winds, and I thought I was going to die. I thought, "All this for a photo? All this for Russia? What will Rene think?"

I didn't take my camera out, because I was like, "I'm just going to go with the phone." I've been increasingly happy with that.

When we just got started, we had the 7 Plus, and I was like, "Hmm, I'll take my camera along as well." Increasingly, it's only for really specific things. If you want to do some good portrait photos, portrait mode is fantastic, but it can't beat a good portrait lens yet, especially when it comes to really shallow depth of field.

When it comes to shooting at night, it's really nice to have a big sensor and a big camera for it. Apart from that...

Rene: Beyond sneaker zoom, you have 2X now, which is optical. Still, I find if you really want zoom -- if you want to shoot a soccer game or an event -- I still need to have that big glass to do that.

Sebastiaan: Wildlife. You can walk up to a bird sometimes, but a zoom lens makes it a little easier.

What's next for iPhone photo apps?

Rene: They're cagey. We're entering the new year. WWDC is coming up in June. There'll obviously be a new iPhone in September. Is there anything you'd like to see next from Apple, either software, hardware, mix of both?

Sebastiaan: We always want to have more control over the camera. It's really hard for developers to do what Apple's doing. Obviously, their engineers get incredible access to the camera. They do all sorts of magic things. We'd love to see more of that.

Overall, I would want to see more double cameras everywhere. I want everything to have two cameras now. It enables so many cool things. We're just scratching the surface, like you said, with the kind of sensor data we're getting out of it.

Give them all TrueDepth cameras. Give the iPad double cameras. Give everything double cameras. It would really, really help with so many cool applications.

Rene: Could you do double TrueDepth cameras, or would that be a waste?

Sebastiaan: Super overkill. I don't know. It would be cool.

Rene: More data. More data is never bad.

Sebastiaan: More data is never bad. Now the notch will be twice as big. Imagine that. That's going to be quite a comment storm on the Internet, Rene.

Rene: Yeah. It will be a notch on the back, a notch on top, a notch on the left, notches everywhere.

Sebastiaan, if people want to find out more about you, more about Halide, where can they go?

Sebastiaan: We have a website at halide.cam, C-A-M. I'm on Twitter, @sdw. My partner Ben, who builds Halide with me, is on Twitter @sandofsky. That's about it, I'd say.

Rene: Check all the Instagrams.

Sebastiaan: Yeah. Feel free to check our Instagrams. We do feature photos from Halide users, which is super cool. We've seen it used in over 120 countries now, which is absolutely crazy.

Rene: Should they hashtag Halide, do they @Halide? What's the best way to...?

Sebastiaan: If you take a Halide photo, tag it with #ShotWithHalide, and we'll totally feature your shot.

Rene: Thank you, Sebastiaan. Thank you so much.

Sebastiaan: Thanks, Rene.

See Halide on the App Store

