While I am referring to a specific article/video by Macrumors.com, this is not a thesis on the iPhone, nor am I representing Apple or its viewpoints at all. I am a cinematographer and photographer with extensive knowledge of how cameras, lenses, and software work. This is about correcting misconceptions about how cameras work.
The video discusses how the iPhone 11 Pro camera isn’t “that much better” than the iPhone XS camera. Without going into a long book on how cameras operate, let’s just dive into some basics.
Firstly, the author’s nomenclature is very much incorrect. To clarify some terms and provide a general understanding of how a camera functions, permit me to start with how the term “wide-angle” is used versus how it should be used.
“Wide-angle sensor” or “wide-angle camera”—these things do not exist. There is no such thing. A camera can have a wide-angle lens or a sensor of a particular size, but at no point will the camera or the sensor itself be wide-angle.
A sensor, as its name implies, is only the part of a camera that records a signal (in this case, light).
A camera is a system, a sum of parts, that outputs or captures an image. A rudimentary explanation of the process by which a camera captures an image is as follows:
Light is transmitted through the lens (whatever the focal length may be), through the aperture, and is focused onto the sensor, which may include a color filter array, an infrared filter, and more.
That signal is passed through a series of processes, such as an analog-to-digital converter, noise-reduction firmware, and, in the case of the iPhone, the A13 Bionic chip, and so on.
The signal continues on through the main board, where the final processing takes place (even for RAW files).
Finally, it is transferred to storage.
This is a very rudimentary list, and the actual process is more complex, but the general concepts apply.
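For readers who like to see that chain spelled out, here is a minimal sketch of it in Python. Every stage name and number below is an illustrative placeholder of my own; none of it represents Apple’s firmware or the A13’s actual processing.

```python
import numpy as np

def lens(scene_luminance, f_number=1.8, exposure_time=1 / 60):
    """Light transmitted through the lens scales with aperture (f-number) and time."""
    return scene_luminance * exposure_time / (f_number ** 2)

def sensor(exposure):
    """The sensor only records the signal that lands on it, pixel by pixel."""
    photons = np.random.poisson(exposure)                 # photon shot noise
    read_noise = np.random.normal(0, 2, exposure.shape)   # electronics noise
    return photons + read_noise

def analog_to_digital(signal, bits=12):
    """Quantize the analog signal into digital values."""
    return np.clip(np.round(signal), 0, 2 ** bits - 1).astype(np.uint16)

def image_pipeline(scene):
    """Lens -> sensor -> ADC -> (further processing) -> storage."""
    raw = analog_to_digital(sensor(lens(scene)))
    return raw  # a real camera continues through demosaicing, noise reduction, etc.

scene = np.full((4, 4), 5000.0)  # a tiny, uniform "scene" purely for illustration
print(image_pipeline(scene))
```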
Cameras are not sensors, they are systems.
The size of a sensor also has nothing to do with the lens. While sensors of different sizes will give different fields of view, it would be inaccurate to ascribe the term “wide-angle sensor.” The focal length of the lens is what truly determines the angle of view; changing what sits behind the lens does not change the laws of physics.
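To make the relationship concrete, the standard rectilinear angle-of-view formula is sketched below. For a given sensor, only the focal length changes the angle; a different sensor size merely crops more or less of the image circle that the lens projects. The focal length and sensor dimensions here are illustrative examples, not exact iPhone specifications.

```python
import math

def angle_of_view(sensor_dimension_mm, focal_length_mm):
    """Horizontal angle of view, in degrees, for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_dimension_mm / (2 * focal_length_mm)))

# Same 26 mm lens, two different sensor widths: the smaller sensor simply
# crops the projected image circle; the lens itself has not changed.
print(angle_of_view(36.0, 26.0))   # full-frame width behind a 26 mm lens
print(angle_of_view(17.3, 26.0))   # Micro Four Thirds width, same lens
```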
Most importantly, the phrase “larger sensor lets in more light” is patently false. Sensors record data. That is it. Under no circumstances does the size of the sensor affect the light transmitted through the lens. What a larger sensor may change are the signal-to-noise ratio, dynamic range, signal gain, highlight clipping point, field/angle of view, and low-light sensitivity. However, to be very clear, those are all highly generalized statements and are not necessarily true.
The lens, however, may allow more light to reach the sensor if the aperture is larger or the elements are fewer or thinner, and in the case of the iPhone 11, the new lenses do have larger apertures. But no, larger sensors do not ‘let in more light’—they do not function like solar panels, where one single signal is converted to another (electricity). Image sensors collect data on a pixel-by-pixel basis. A pixel on the right side of the sensor does not affect a pixel on the left side, nor does the exposure change because of a larger sensor.
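A rough sketch of why this is so: in the simplified camera equation for the illuminance a lens delivers to the sensor plane (ignoring transmission losses and vignetting), only scene luminance and the f-number appear; sensor size never enters into it. The numbers below are arbitrary examples.

```python
import math

def sensor_plane_illuminance(scene_luminance_cd_m2, f_number):
    """Approximate illuminance (lux) at the sensor plane for a distant subject."""
    return math.pi * scene_luminance_cd_m2 / (4 * f_number ** 2)

# Same scene, same f/1.8 lens: the result is identical whether the sensor
# sitting behind the lens is large or small.
print(sensor_plane_illuminance(1000, 1.8))
```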
Image quality is not easily discussed because it is like saying, “lunch today was good”; there are many attributes taken into account when deciding what makes a meal good. For example, flavor, price, quantity, and preparation are all variables. The same goes for image quality. Some attributes may be: sharpness, dynamic range, color rendition, image fidelity, contrast, amount of noise, detail, and much more.
However, the images discussed were used to support the claim that the iPhone XS had better image quality than the iPhone 11 Pro. The street image, which is like taking a picture of a brick wall and isn’t a very good real-life example of how one uses a camera, is sophomoric. The higher contrast of the iPhone XS, while I do personally like it, is the look of lower dynamic range. The iPhone 11 Pro has better signal gain due to a new sensor, a larger aperture, and different processing. Higher dynamic range will appear lower in contrast; however, it in fact retains more detail and gives more flexibility in editing an image. Had the author taken a moment to edit the photos, the image from the iPhone 11 Pro would likely be superior. Discussing straight-out-of-camera (SOOC) images is shortsighted and shows a lack of understanding of how people actually photograph.
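As a toy illustration of that editing flexibility (and not a representation of anyone’s actual processing), a flat-looking, high-dynamic-range capture keeps its shadow and highlight detail, and contrast can always be added back with a simple tone curve in post.

```python
import numpy as np

def add_contrast(image_0_to_1):
    """Generic smoothstep S-curve: darkens shadows, brightens highlights."""
    x = np.clip(image_0_to_1, 0.0, 1.0)
    return x * x * (3 - 2 * x)

flat_capture = np.linspace(0.1, 0.9, 9)   # low-contrast tones, nothing clipped
print(add_contrast(flat_capture))          # punchier, and the detail is still intact
```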
He goes on to mention that the highlights are blown out in the iPhone 11 Pro images. I don’t know what he is referring to; I see no blown highlights other than the sun in a few shots (which is to be expected). More importantly, the higher dynamic range will give the impression that highlights are blown when in fact they are very much fine. To be fair, we are discussing JPG images, so there is less flexibility, but dynamic range is dynamic range.
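For anyone who would rather verify this than eyeball it, a quick check is to count how many pixels sit at the file’s ceiling. In an 8-bit JPG that ceiling is 255; a handful of clipped pixels around the sun is expected, while large clipped regions would mean lost detail. The sample values below are made up purely for illustration; load a real JPG (for example, with Pillow) to test an actual photo.

```python
import numpy as np

def clipped_fraction(image_8bit, threshold=254):
    """Fraction of pixels at or above the clipping threshold."""
    image_8bit = np.asarray(image_8bit)
    return np.count_nonzero(image_8bit >= threshold) / image_8bit.size

sample = np.array([[120, 200, 255], [254, 90, 30]], dtype=np.uint8)
print(f"{clipped_fraction(sample):.1%} of pixels are clipped")
```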
Below is an example of a photo where one might see the image as having clipped speculars, but after processing they are not. Part of my Produce Portraits series, these are the same image: one (left) is SOOC while the other (right) is processed. Shot on an Olympus E-M5 with the M.Zuiko 60mm ƒ2.8 Macro lens at ƒ8, ISO 200, RAW.