Year after year, smartphone cameras have become more capable, more versatile, and more of a reason to leave your DSLR at home. So what are the tech innovations that have made the Pixel 2, the iPhone X, the Galaxy S9 and others such good photo takers compared to that old iPhone 6 or Galaxy S5?
Obviously, the technical aspects of photography and cameras can get very nuanced. But in broad strokes, here’s a look at the ways some key technologies have improved over the years to make your ‘grams sharper and your snaps brighter.
At the core of the camera spec is the number of megapixels it captures—simply put, the resolution of the image that gets captured. It was by no means the first smartphone with a camera, but for comparison purposes the original 2007 iPhone came rocking a 2-megapixel rear camera with a fixed focus, capable of capturing images 1600 x 1200 pixels in size. Today’s Galaxy S9 and iPhone X have 12-megapixel cameras.
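The megapixel figure is just arithmetic on the frame dimensions, which a quick sketch makes concrete (the 4032 x 3024 frame size below is a typical 12-megapixel example):

```python
# Megapixels are simply width x height, counted in millions of pixels.
def megapixels(width_px, height_px):
    """Return the resolution in megapixels for a given frame size."""
    return width_px * height_px / 1_000_000

print(megapixels(1600, 1200))  # original iPhone: 1.92, marketed as 2 MP
print(megapixels(4032, 3024))  # a typical 12 MP frame size
```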
In the early days of smartphone cameras, megapixels were the yardstick that these components were measured by: More megapixels meant a better camera, generally speaking. But that isn’t necessarily true now and it wasn’t necessarily true then, because there are a whole host of other factors that affect image quality, as you can see from the extensive list below.
The problem with cramming more pixels into the same-sized sensor is the pixels get smaller, and let in less light. Remember the HTC UltraPixels introduced in 2013? That was an attempt to reduce megapixels, increase pixel size, and therefore capture more light (and therefore detail) as the camera shutter flashed open for a split second. HTC was on to something, because today the megapixel race has all but ended, with smartphone makers making improvements elsewhere instead.
It is a truth universally acknowledged that the bigger the image sensor in a camera, the better the end result (essentially, it lets the camera capture more light and more color detail). With any camera, you’re relying on several components working well together, but the image sensor is a crucial one.
It’s a shame then that there’s not much room inside smartphones—mobile camera sensors tend to be between 1/2.3 and 1/3 inches, much smaller than those inside DSLRs and even quality point-and-shoot cameras, though manufacturers are often coy about the specs in this regard. In fact, sensor size hasn’t changed much over the years in smartphone photography, because of those physical limitations, and it’s usually been in other areas where improvements have been made.
You’ll have a hard time digging down into any phone’s specs to find the image sensor size for the camera advertised, but the Nexus 6P was an exception—its 1/2.3-inch sensor is on the larger end of the scale, particularly for 2015, though sensor size alone isn’t a spec where modern-day devices are all that much better than phones of yesteryear. Note too the 6P’s 1.55 μm (micrometer) pixel size, larger than the 1.4 μm pixels in the Pixel 2, with a 1/2.6-inch sensor.
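Pixel size falls straight out of sensor width and pixel count, which is why bigger sensors (or fewer pixels) mean bigger pixels. A rough sketch, using approximate, illustrative dimensions rather than any manufacturer’s official figures:

```python
# Back-of-the-envelope: pixel pitch is sensor width divided by the
# number of pixels across it. The numbers below are approximations
# chosen for illustration, not official specs.
def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    """Pixel pitch in micrometers (1 mm = 1000 um)."""
    return sensor_width_mm / horizontal_pixels * 1000

# A ~1/2.3-inch sensor is very roughly 6.3 mm wide; at about 4048
# pixels across, that works out to roughly 1.55 um—in the same
# ballpark as the Nexus 6P's quoted pixel size.
print(round(pixel_pitch_um(6.3, 4048), 2))
```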
And of course for all the cameras that don’t advertise their sensor size, enterprising teardown artists do the work, and usually reveal that what we’re working with is teensy.
On smartphone cameras as well as regular cameras, aperture controls the amount of light that gets to the image sensor. In a regular camera, aperture is manipulated to optimize for lighting conditions, blur, and desired depth of field, but in the world of smartphone cameras, in which optics are severely constrained, phone makers tend to optimize for having the widest aperture possible. This allows cameras to capture lots of light in all of those dark settings in which we all love to take photos, while keeping the shutter speed quick enough that your photo doesn’t come out blurry. (Super-wide apertures have their downsides, but we’ll set them aside for now.)
Aperture size is measured in f-stops, and the smaller the f-stop, the wider the aperture (or opening). Last year the LG V30 camera set a new high watermark with an f/1.6 aperture, since surpassed by the dual aperture tech on the Samsung Galaxy S9, which lets you switch between f/1.5 and f/2.4 apertures, depending on what you’re trying to achieve with your pictures. You can get a great close-up look at the mechanism in this JerryRigEverything video.
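Because the f-number is focal length divided by aperture diameter, the light gathered scales with the inverse square of the f-number. That makes it easy to compare the Galaxy S9’s two apertures:

```python
# Light gathered scales with the aperture's area, i.e. with the
# inverse square of the f-number.
def relative_light(f_stop_a, f_stop_b):
    """How many times more light aperture A gathers than aperture B."""
    return (f_stop_b / f_stop_a) ** 2

# The Galaxy S9's f/1.5 mode gathers roughly 2.6x the light of its
# f/2.4 mode.
print(round(relative_light(1.5, 2.4), 2))
```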
Wider apertures have been made possible through the years as lens manufacturing quality has increased—something that’s of paramount importance if you’re letting more light in and want to keep a sharp, focused picture.
The on-board camera flash may not be as important as some other components, but it too has made strides in the years that smartphones have been with us. Older phones, particularly Nokia and Sony models, made use of Xenon flash—very bright, but bulky and power-hungry too.
Today, phones use LED or dual-LED flash to produce a more subtle effect. In the case of dual-LED, two LEDs are used with slightly different color temperatures, theoretically producing an end result with a better balance of colors that isn’t completely unnatural. Look closely at the flash on the back of your phone and you may well see the two tiny bulbs.
The most recent iPhones include even more improvements, and show how various smartphone camera specs work together to produce better results than the previous generation. As well as introducing quad-LED flash in 2016, the 2017 models debuted a feature called Slow Sync: It keeps the shutter open longer to capture more light and reduce the light needed from the flash, which can flash less brightly for less time.
Maybe you’ve never thought much about the focus on your smartphone’s camera if you’re not shooting sports or wildlife, but it’s pretty significant in the overall quality of your shot. It works by moving the camera lens on tiny motors to make the object of your photo nice and clear, but a host of other hardware and software factors are at play—and down the years, phone autofocus has become much more accurate, and much faster.
Before 2015, phone cameras focused solely based on the contrast they could detect in a scene. Starting with the Galaxy S5 and iPhone 6, phase detection was added, built right into the sensor: It uses the information coming in from both sides of the lens to calculate where the perfect focus is (where the points of light should meet). It’s faster than the standard contrast detection method, but it’s still not great in low light.
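The older contrast-detection approach can be sketched in a few lines: step the lens through candidate positions, score each frame by how much local contrast it contains, and keep the position that scores highest. The blur model below is invented purely for illustration:

```python
# Toy contrast-detection autofocus: sharp images have stronger
# pixel-to-pixel differences, so we maximize a simple contrast score.

def contrast_score(image):
    """Sum of squared differences between horizontally adjacent pixels."""
    return sum((row[i + 1] - row[i]) ** 2
               for row in image for i in range(len(row) - 1))

def toy_frame(focus_error):
    """Fake a one-row 'image' whose edge softens as focus error grows."""
    edge = max(0.0, 1.0 - abs(focus_error))  # sharp edge fades with defocus
    return [[0.0, 0.0, edge, edge]]

def autofocus(lens_positions, true_focus):
    # Pick the lens position whose frame has the highest contrast.
    return max(lens_positions,
               key=lambda p: contrast_score(toy_frame(p - true_focus)))

print(autofocus([0, 1, 2, 3, 4], true_focus=2))  # settles on position 2
```

This also shows why contrast detection is slow and struggles in low light: the camera has to sweep through positions hunting for a peak, and when the scene has little contrast to begin with, the scores barely differ.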
Enter more smartphone camera tricks. The dual pixels used on the most recent Galaxy phones, for example, turn every pixel into a little phase detection system, improving performance in darker scenes. For its Pixel phones, Google went with a time-of-flight infrared laser to measure distances quickly in any lighting situation. Again, it shows manufacturers getting creative, and in different ways, to improve photos taken on mobile.