According to reports, Apple has sold over a hundred million units of the iPhone 12 Pro and forty million units of the iPhone 13 Pro since their launches in the fall of 2020 and 2021, respectively. Below is a look at how dramatically the iPhone camera has changed.
Both models are hugely popular with consumers and boast powerful cameras despite their tiny lenses. Until recently, phone cameras produced little more than basic digital point-and-shoot snapshots, and people didn't expect anything more of them.
However, with the latest iPhone models, Apple aims to make the phone camera perform as much like a traditional camera as possible, earning the devices their "Pro" distinction. The iPhone 13 Pro, for instance, features three separate lenses, takes twelve-megapixel images, and employs machine learning to optimize lighting and focus.
Yet, some users have reported unwanted visual glitches caused by the device’s intelligent photography features. For instance, Halide, a camera app developer, found that the 13 Pro occasionally erases objects in landscape shots due to the “complex, interwoven set of ‘smart’ software components that don’t fit together quite right.”
For much of the public, "smartphone" has become synonymous with "camera," but iPhones are no longer cameras in the traditional sense. Instead, they are at the forefront of "computational photography," in which the final image depends as much on digital data and processing as on the optical information the lens gathers.
The lens captures a photo, which is then altered to fit a pre-programmed ideal of what a photo should look like. But the approach has limits. Gregory Gentert, a fine-art photographer in Brooklyn, says that the iPhone's attempts to correct the bluish light around dusk often produce unwanted edits.
The hue is treated as a flaw rather than a feature; the device sees the subject as a problem to solve. Image processing also eliminates digital noise by smoothing it into a soft blur, which can leave photos looking smudgy. The fix ends up creating a distortion that is more noticeable than whatever mistake it was meant to correct.
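To make that trade-off concrete, here is a minimal Python sketch in which a plain Gaussian blur stands in for the far more sophisticated denoising a phone actually performs; it illustrates the principle, not Apple's pipeline.

```python
# Illustrative sketch only, not Apple's algorithm. Requires numpy and Pillow.
import numpy as np
from PIL import Image, ImageFilter

def simulate_denoise(path: str, noise_sigma: float = 12.0, blur_radius: float = 2.0) -> Image.Image:
    """Add sensor-like noise to a photo, then 'fix' it with a blur that also smudges detail."""
    clean = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)

    # Small sensors collect random noise, especially in low light.
    noisy = clean + np.random.normal(0.0, noise_sigma, clean.shape)
    noisy = np.clip(noisy, 0, 255).astype(np.uint8)

    # A crude denoiser: smooth the image. The grain disappears, but so do fine
    # textures -- the "smudginess" described above.
    return Image.fromarray(noisy).filter(ImageFilter.GaussianBlur(radius=blur_radius))
```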

During the twentieth century, photography enabled the mass reproduction of artworks, making them more accessible while diminishing the impact of any individual original. Traditional cameras, though, lent their images something like the "aura" Walter Benjamin described: each produced pictures with its own distinctive character.
Consider the pristine Leica photo taken with a fixed-focal-length lens, or the Polaroid instant snapshot with its spotty exposure; such images were inextricably linked to the mechanics of the cameras that made them.
The iPhone, however, has made the camera infinitely reproducible. Its digital tools can mimic any camera, lens, or film at any moment, without the manual skill previously required, much as early photographs replicated painters' brushstrokes.
The resulting images, however, are a shallow copy of photographic technique that undermines the impact of the original: they strain toward the appearance of professionalism and mimic artistry without ever truly achieving it. The average iPhone photo looks professional, but looking professional does not make it good.
The iPhone 13 Pro includes a new feature called Photographic Styles, designed to let users in on the computational-photography process. Whereas familiar editing tools work on a whole image after it has been taken, Styles folds the adjustments into the capture pipeline itself, during the stages of semantic analysis and selection between frames, somewhat like changing settings on a manual camera.
The process changes how the photo will be taken when the shutter button is pressed. The Tone dial combines brightness, contrast, saturation, and other factors, while the Warmth dial shifts the photo's color temperature.
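As a rough sketch of that distinction, the toy Python pipeline below applies a warmth adjustment to every frame before the frames are merged, rather than filtering the finished photo afterward. The averaging merge and the channel-scaling factors are assumptions chosen for simplicity; Apple's actual processing is far more elaborate.

```python
# Toy illustration of where an adjustment hooks into the pipeline, not Apple's code.
import numpy as np

def apply_warmth(frame: np.ndarray, warmth: float) -> np.ndarray:
    """Shift the color temperature of an RGB image by scaling the red and blue channels."""
    adjusted = frame.astype(np.float32)
    adjusted[..., 0] *= 1.0 + 0.1 * warmth   # warmer -> more red
    adjusted[..., 2] *= 1.0 - 0.1 * warmth   # warmer -> less blue
    return np.clip(adjusted, 0.0, 255.0)

def capture_with_style(frames: list[np.ndarray], warmth: float) -> np.ndarray:
    """Styles-like pipeline: adjust each captured frame, then merge (here, a plain average)."""
    return np.mean([apply_warmth(f, warmth) for f in frames], axis=0)

def filter_after_capture(frames: list[np.ndarray], warmth: float) -> np.ndarray:
    """Old-style filter: merge the frames first, then adjust the finished image."""
    return apply_warmth(np.mean(frames, axis=0), warmth)
```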
These adjustments have more subtle effects than the iPhone’s older post-processing filters, but the fundamental qualities of new-generation iPhone photographs remain. They are coldly crisp and vaguely inhuman, caught in the uncanny valley where creative expression meets machine learning.
Portrait Mode, one of the most dramatic features of Apple’s computational photography, imitates the way a lens with a wide aperture captures a subject in the foreground in sharp focus while blurring out the background. This effect is achieved not by the lens itself but by algorithmic filters that determine where the subject is and apply an artificial blur to the background.
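The general idea can be sketched in a few lines of Python: given a mask marking where the subject is (Apple derives this from depth data and machine-learned segmentation; here it is simply assumed as an input), the background is blurred and the sharp subject is composited back on top. This is an illustration of the concept, not Apple's implementation.

```python
# Sketch of a portrait-mode-style effect, not Apple's code. Requires Pillow.
# The subject mask is assumed to be supplied: white where the subject is, black elsewhere.
from PIL import Image, ImageFilter

def fake_portrait_mode(photo: Image.Image, subject_mask: Image.Image, blur_radius: float = 8.0) -> Image.Image:
    """Blur the background, keep the masked subject sharp."""
    blurred = photo.filter(ImageFilter.GaussianBlur(radius=blur_radius))

    # Composite: where the mask is white, keep the sharp photo; elsewhere, use the blur.
    # An imperfect mask produces exactly the artifacts described below --
    # fuzzy hair edges, or a second person melting into the background.
    return Image.composite(photo, blurred, subject_mask.convert("L"))
```

Everything depends on the quality of that mask, which is where the trouble starts.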

Bokeh, as this gauzy quality is known, was once exclusive to glossy magazines and fashion photo shoots. Now, it is simply another aesthetic choice open to any user, and the digital simulation is often unconvincing. Users might notice the algorithm’s imperfections, such as fuzzy outlines of a subject’s hair or secondary figures registered as part of the background and blurred out altogether.
This machine-approximated version of bokeh signifies amateurism rather than craft, leading hobbyists who dislike such technological tricks to seek out older digital cameras or flee back to film.
However, the new iPhone cameras are reshaping the nature of image-making, and our expectations of what a photograph should be, by setting the standard for what a normal picture looks like. David Fitt, a photographer based in Paris, hopes that clients will not start asking for this kind of look.