We get to test a lot of premium phones, and every single one of them claims to have perfected the art of mobile photography. It’s true that there is a lot of innovation in this space, but also a lot of marketing tactics (you know, like “more megapixels = better”). That’s why it’s quite difficult from a user’s perspective to figure out if a camera is really good before purchasing a phone. Let’s define what a “great phone experience” is, and what features you should look for.
Fortunately, there are excellent technologies to address these problems: a higher megapixel count is a good thing in good lighting conditions, because there's plenty of light to work with, so the extra resolution translates into a sharper photo. That's particularly true for "nature" scenes that have a lot of fine detail (grass, foliage, etc.).
I know… megapixels don't always mean "quality", but with a bright scene, they can make a perceptible difference. Of course, the sensor type (backside-illuminated or not), sensor size, sensor pixel size and lens aperture also contribute to how much light the sensor can pick up. The more light, the more data, and the better the opportunity to have a great picture. Whether that opportunity is realized depends on the software processing.
Optical Image Stabilization (OIS) and Digital Image Stabilization (DIS) are two different facets of the larger Image Stabilization (IS) problem, which exists because there's an inherent (tiny) shakiness when we hold devices to take photos. IS doesn't matter much when the exposure time is super-fast (in broad daylight), but as light gets scarcer, exposure time has to increase, and camera motion leads to a blurry picture. Not good, and that's why phone-camera reviews should not use tripods for tests: that's not how people use their phones.
Optical Image Stabilization means that camera motion is compensated for by physically moving the lens assembly (via a motor, magnets, or a fluid encasing). OIS is good for very small camera motion such as hand shake. It does not help with larger motion such as walking.
Digital Image Stabilization can help with larger motions, such as walking or the shake from riding a bike on bumpy pavement. To learn more, read our overview on Image Stabilization, but in general, stabilization helps produce a sharp image and non-shaky movies. Electronic Image Stabilization (EIS) is a variant of DIS. Neither DIS nor EIS helps with low-light capture, because they cannot be used to increase the exposure time.
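To make this concrete, here is a minimal sketch of the core idea behind digital/electronic stabilization: estimate the global shift between consecutive frames, then shift and crop the new frame to cancel it. This is plain NumPy, purely illustrative, and not any manufacturer's actual pipeline; the function names and the crop margin are assumptions made for the example.

```python
import numpy as np

def estimate_correction(prev_frame, cur_frame):
    """Return the (dy, dx) shift that re-aligns cur_frame with prev_frame,
    estimated with phase correlation on grayscale float frames."""
    f1 = np.fft.fft2(prev_frame)
    f2 = np.fft.fft2(cur_frame)
    cross_power = f1 * np.conj(f2)
    cross_power /= np.abs(cross_power) + 1e-8          # keep only the phase information
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Shifts larger than half the frame wrap around; map them back to negative values
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, correlation.shape))

def stabilize(prev_frame, cur_frame, margin=32):
    """Shift cur_frame to cancel the estimated motion, then crop the borders
    (which is why digitally stabilized video has a slightly narrower field of view)."""
    dy, dx = estimate_correction(prev_frame, cur_frame)
    compensated = np.roll(cur_frame, shift=(dy, dx), axis=(0, 1))
    return compensated[margin:-margin, margin:-margin]
```

Note how nothing in this process gathers more light: the frames are simply re-aligned after capture, which is why DIS/EIS cannot rescue a dark scene the way a longer, OIS-assisted exposure can.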
Because most high-end smartphones do very well in good lighting conditions, Low Light Photography is usually a primary concern for users. It is typically the most difficult condition for the whole digital photography pipeline, from sensor, to image signal processor (ISP), to software image tuning and enhancement (de-noising, etc.).
Bokeh, or depth-of-field blur, is the blur that happens in the out-of-focus areas. In general, the larger the lens and the aperture, the stronger the blur. The problem is that mobile phone optics are tiny, and although there is still some blurring going on, its quality is not as nice as what you get with a bigger camera. But there's a way to compensate for it.
By taking two images consecutively with a tiny camera motion separating them (from hand shake), it's possible to estimate the distance between the lens and each pixel in the photo (a depth map). Knowing how far the point of focus is, an algorithm can then artificially increase the blur everywhere else.
Ufocus in action: from top to bottom, the normal photo and two photos with a bokeh effect. The green circle shows where I set the virtual focus point
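To illustrate the depth-map idea above, here is a toy sketch of synthetic bokeh, assuming the depth map has already been computed (for example, from two slightly offset shots). The function name, the linear blur ramp and the single blur level are simplifying assumptions, not any vendor's actual algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth, focus_depth, max_sigma=8.0):
    """Blur each pixel in proportion to how far its depth is from the focus plane.

    image: HxWx3 float array in [0, 1]; depth: HxW float array (same units as focus_depth).
    """
    # Blur strength grows with distance from the chosen (virtual) focus depth
    depth_range = depth.max() - depth.min() + 1e-8
    blur_amount = np.clip(np.abs(depth - focus_depth) / depth_range, 0.0, 1.0)

    # One maximally blurred copy, blended per pixel against the sharp original.
    # (A real implementation would use many blur levels and handle occlusion edges.)
    blurred = np.stack(
        [gaussian_filter(image[..., c], sigma=max_sigma) for c in range(3)], axis=-1
    )
    weight = blur_amount[..., None]   # HxWx1, broadcast over the color channels
    return (1.0 - weight) * image + weight * blurred
```

Re-focusing after the shot is then, conceptually, just a matter of running the same computation again with a different focus depth, which is what features of this kind let you do with the virtual focus point.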
Cameras with dual lenses have a much better chance of getting a nice bokeh than single-lens cameras, because the image separation is large (10mm vs. ~1mm or less) and constant. The Huawei P9 is a good example of such a camera, and the HTC One M8 featured a similar concept. The LG G5 has two cameras but doesn't support this 2D-to-3D reconstruction, because the second lens is only used for wide-angle shots.
Related: read our LG G5 review, HTC One M8 review
The Google Camera app on Nexus phones can do it well because it explicitly asks the user to move the camera up by several inches, giving a nice separation for the second shot. The end result is very neat, but it's not very convenient. The feature is called Lens Blur.
A real-time preview that is as accurate as possible ensures that the user knows what the final picture will look like. This is important because it's the moment when you can still move the camera or change some settings. Sometimes the final image differs from the preview so much that the surprise is either very good (most of the time) or very bad (sometimes). In any case, the user should be in control here.
Image quality is great, but it's not everything. Phone users expect the whole photography experience to be fast, which means the camera must be operational as soon as possible.
There are various strategies for this to happen, and they help enormously. For example, many manufacturers (OEMs) have a quick way to turn the phone on and launch the camera app. Samsung introduced the double-click on the Home button, but LG and others have it too now.
Huawei's P9 can turn on and snap a picture right away, and Sony's XPERIA X can even snap the photo before the display turns on. In both cases, it's hard to aim and frame, but if that's your only chance, you can take it.
In reality, most users don't capture photos without seeing what they shoot, but once the camera app is launched, the auto-focus (AF) speed is hugely important. Older handsets used contrast-based AF, which is slow and inaccurate in low light. Modern phones use laser or phase-detection AF with dozens to a couple of hundred phase-detection points.
However, the introduction of Dual-Pixel Diode AF by Samsung in the S7 changed the game by using 4 million AF points at any given time, making AF blazing fast on their 2016 high-end phones.
Once the first photo has been taken, you may want to do a quick follow-up, or maybe even capture as many photos as you can to make sure you have one good shot. This is where speed matters. How fast the camera can “move on” and get ready for the next shot is a huge deal.
You don’t want to be waiting around for a few seconds in-between shots. The BlackBerry PRIV took excellent pictures, even better than the Galaxy S6 (the king of that time), but it could do so because it spent considerable time computing enhancements to the image before “moving on”. I found it to be unacceptably slow — if only BB had done this work in the background instead…
Related: read our BlackBerry PRIV review
A great phone camera must come with the best settings out of the box, or be able to adapt dynamically to the situation, because we hate fiddling with settings while taking a photo. This seems obvious, but we have seen plenty of phones that don't have HDR (high dynamic range) ON by default.
You can learn more about High Dynamic Range, but it is basically a technique that captures multiple exposures and chooses, for every pixel in the photo, which one works best in the final image. Thanks to HDR, you don't have to worry as much about where the light comes from. The main HDR difference between phones is that some have to take two or more pictures, while others can do it in a single shot. Single-shot HDR avoids ghosting artifacts, so it's better in most cases.
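For readers who like to see the per-pixel idea spelled out, here is a simplified exposure-fusion sketch: each pixel from each exposure is weighted by how well exposed it is (close to mid-gray), then the exposures are blended. Real HDR pipelines also align the frames (which is where ghosting comes from when it fails) and apply tone mapping; the weighting scheme and names here are illustrative assumptions, not a specific phone's implementation.

```python
import numpy as np

def exposure_fusion(exposures, sigma=0.2):
    """Blend a list of HxWx3 float images in [0, 1], favoring well-exposed pixels."""
    stack = np.stack(exposures)                        # N x H x W x 3
    luminance = stack.mean(axis=-1)                    # N x H x W
    # Pixels near mid-gray (0.5) get the highest weight in the blend
    weights = np.exp(-((luminance - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-8
    return np.einsum('nhw,nhwc->hwc', weights, stack)  # per-pixel weighted average
```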
In general, preset "modes" are interesting, but they get marginal usage because people don't want to switch modes before taking a photo; most will trade off quality for convenience. That's why special modes can get our attention for specific situations, but they aren't weighted heavily, because they simply don't get used much.
The built-in color balance algorithm is also key to building a great camera. Every phone manufacturer will come up with different methods and settings. Since this is more art than science, there’s no clear “best way to do it”. Fortunately, top phone manufacturers are pretty good. A good color balance and metering system means that the camera “will do the right thing” when selecting exposure and coloring to reproduce what you’re looking at.
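As a rough illustration of the kind of decision a color-balance algorithm makes, here is the classic "gray world" heuristic: assume the scene averages out to neutral gray and scale each color channel so its mean matches the overall mean. Actual phone cameras use far more sophisticated, hand-tuned methods; this sketch only shows the flavor of the problem the camera solves for you.

```python
import numpy as np

def gray_world_white_balance(image):
    """image: HxWx3 float array in [0, 1]; returns a white-balanced copy."""
    channel_means = image.reshape(-1, 3).mean(axis=0)      # average R, G, B of the scene
    gains = channel_means.mean() / (channel_means + 1e-8)  # per-channel correction gains
    return np.clip(image * gains, 0.0, 1.0)
```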
Because most mobile photos are seen on your own smartphone, the quality of your display is critical. Two remarks from tech journalists are stuck in my mind. I sent photos to one, and that person said: “wow, it doesn’t even look like the same picture on my phone”. We both had $650+ phones, so this tells you that there are perceptible differences, even at the high-end.
After receiving side-by-side comparison shots from two high-end phones, another journalist said: "looking at this on my crappy (PC) display, both look great, I can't tell the difference". This shows that the perceived quality of the very same photo can vary greatly depending on the screen it is viewed on.
Because your experience of mobile photography is based on what you see on your device, we think that the display quality is of extreme importance in the overall mobile photography experience.
The elements above are, we think, the key to a great phone camera experience. It’s more than just photo (file) quality. It’s also about being intuitive, easy and fast.
Phone cameras are not like their larger counterparts: they are used in a different context, with slightly different goals, and looking only at image quality at the expense of the other aspects of mobile photography could ruin your experience. The whole experience is what you are using, and paying for. To conclude, here's a summary of the important things to look for: