Every photo in this post was taken with Halide’s Process Zero on an iPhone 13.
Knowing how much work Lux Optics puts into their apps, Halide and Kino, I don’t think their recent Process Zero was implemented as a reaction to the ongoing backlash against “AI”. Admittedly, now that people are increasingly negative about generative models, releasing a new photography mode that bypasses “AI” processing does feel like a clever marketing stunt.
But it’s much more than a stunt.
The people at Lux put a lot of thought into photography and are enthusiasts themselves. They are keenly aware of the value of the machine learning features that Apple has implemented in their iPhones. But they’ve also long noted that Apple’s heavy-handed processing is a hindrance to photography enthusiasts starting out on the iPhone.
From the iPhone 15 Pro Max review written by Lux Optics’ Sebastiaan de With almost a year ago:
One of the best things I have done to educate myself as a photography instructor and camera app developer is joining a Facebook group of novice iPhone photographers. In the last few years, I’ve watched many novices run into the same issue: distracting processing on the latest and greatest iPhones.
People in photos are blown out and seemingly over-processed, and telephoto shots are reduced to a sloppy, smudgy mush. Some argue that Apple has gone too far in processing, to the extent that it makes images unrecognizable. Many Halide users reduce processing in Capture Settings, or just shoot native RAW to circumvent processing altogether.
But what’s really happening is that the iPhone has gotten better at rescuing photos. Back in the days of the iPhone 7, if I shot my subject backlit by the sun, the result would be unusable. At best, you’d see a silhouette. The iPhone can now take those once-unsalvageable shots and make them passable. Sometimes even good.
The tradeoff is that today’s novice photographers have lost the feedback loop we had back when we learned photography on big, dumb cameras. When a camera never fails, you can be fooled into thinking you’ve mastered photography.
A solid and consistent feedback loop is essential for skills development. For practice to be meaningful, taking the same action in the same context needs to produce similar results, bad or good. Our minds are analogue processes, and this is integral to how we learn: the more “analogues” of an action we have practised, the better our ingrained understanding of that action and the processes surrounding it becomes, and the less conscious we become of the act itself.
If the camera always returns a decent photo, no matter the context, whatever you do, your skills will never improve. You need the failures.
But Apple’s camera app rescues every photo. It tweaks subpar photos into tourist site postcard pics. It turns under-lit mud into passable mediocrity. Whatever you do, it’ll deliver similar results. The action and the results are completely disconnected. Your photography will never improve if all you use are the camera apps that come with your phone.
The innovation of Process Zero isn’t that it improves the image – the final image is arguably worse in multiple ways – but that it establishes a consistent feedback loop in your photography. It turns the iPhone camera into a creative tool.
This is why randomly and haphazardly testing the feature will give you worse results than the built-in camera app. Process Zero files are going to be noisier and have less dynamic range than the images captured by Apple’s app. They might be marginally sharper, but if your conception of what makes good photography is the absence of noise in the signal and a broad range of values captured in the data, you’re going to hate this feature and are unlikely to ever understand the appeal.
Another issue people run into is that you often get worse when you first begin to practise a skill thoughtfully. The unconscious habits that were holding you in a rut are no longer helping you; instead, they’re getting in the way. There is more work involved, and your skills lag behind your taste, which is accustomed both to good-enough automatic results and to the output of those who are already practised. Your first step in practice is usually a step backwards. That’s normal.
What these files have is a consistent quality that correlates with the actions you took as a photographer, and that consistency is what will help you take the first step forward. The files respond to post-processing in predictable ways, meaning you will develop a gut sense for how a picture can be modified while you’re taking it, which is impossible with regular ML-processed files.
If you want to just capture the moment without worrying about the photography process itself, use the built-in app. That’s what it’s for. Not everybody needs or wants to be a photography enthusiast.
This also shows what a smart move Process Zero was on Lux Optics’ part. No matter what Apple adds to their camera app, they’re unlikely to let you turn off all of the ML processing. That gives Halide clear differentiation and positioning.
It’s also fun. It put the iPhone back into proper rotation for my photography walks, where before I’d mostly rely on it if I’d forgotten one of my other cameras or if the weather was bad. (Having a water-resistant camera in your pocket is really useful.)
Like this final picture. I took it on a walk where I had two other cameras with me, both of which I tried to use to get this particular shot, but I prefer the angle and field of view created by the iPhone’s lens.
It’s a fun camera.