Samsung Galaxy A7 triple cameras explained: how do they work?
The new Galaxy A7 (2018), which was announced earlier today, holds the accolade of being the first Samsung smartphone to feature triple rear cameras. It is quite surprising to see Samsung release a mid-range smartphone with such a distinctive feature, one that has so far been exclusive to flagship devices such as Huawei's P20 Pro and the Oppo R17 Pro.
But are the Galaxy A7's triple rear cameras just a marketing ploy to boost sales of the Samsung Galaxy A-series devices? Well, we can't be sure. But if you're wondering how a triple rear camera can improve imaging output, let's jump right into the hardware and see how it works.
Main camera
The Galaxy A7's imaging hardware consists of a 24MP RGB sensor with an F/1.7 aperture, which sits in the middle of the vertical lens stack. Above it is a 5MP depth sensor with an F/2.2 aperture, while an 8MP wide-angle lens with an F/2.4 aperture and a 120-degree field of view sits at the bottom. By comparison, Huawei's P20 Pro packs a 40MP RGB sensor, an 8MP telephoto lens, and a 20MP monochrome lens.
As for the practical applications of these sensors, the Galaxy A7's 24MP RGB sensor acts as the primary camera and is also capable of pixel binning (combining four adjacent pixels into one larger effective pixel) to extract more light and detail from the scene in the final image, especially in low-light conditions.
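To make the idea concrete, here is a minimal sketch of 2x2 pixel binning, assuming a simple software average over each block of four neighbouring pixels (real sensors typically bin at the analog readout stage, but the resolution-for-noise trade-off is the same: a 24MP frame becomes a cleaner 6MP one):

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of sensor pixels into one output pixel."""
    h, w = raw.shape
    # Trim odd rows/columns so the frame divides evenly into 2x2 blocks.
    raw = raw[: h - h % 2, : w - w % 2]
    blocks = raw.reshape(raw.shape[0] // 2, 2, raw.shape[1] // 2, 2)
    # Averaging four noisy readings halves the noise (sqrt of 4),
    # which is why binning shines in low light.
    return blocks.mean(axis=(1, 3))

# Toy example: a noisy 4x4 "sensor" becomes a cleaner 2x2 image.
rng = np.random.default_rng(0)
frame = rng.normal(loc=100.0, scale=10.0, size=(4, 4))
print(bin_2x2(frame))
```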
AI Scene Optimizer
The main camera also gets the Scene Optimizer feature, which was first seen in the Galaxy Note 9. This allows the camera to use AI algorithms to automatically recognize scenes and adjust camera settings such as white balance, exposure, contrast, and brightness to match the scene. Currently, the feature can identify 19 scenes, including food, scenery, street view, night scene, animals, and beach, among others.
Scene Optimizer really makes a difference to overall image quality, as we saw in our review of the feature on the Note 9.
Secondary camera
The 5MP (F/2.2) sensor sits at the top of the stack and is used to capture depth information about the scene. The camera calculates the distance between different objects in its view, so it can separate the foreground from the background and create a better depth-of-field effect for bokeh shots. The SoC handles the depth calculation, and the blur is applied around the subject's edges to make it stand out. Samsung could also let users adjust the blur intensity, as seen on its flagships.
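The basic idea behind depth-assisted bokeh can be sketched in a few lines: blur the whole frame, then composite the sharp original back in wherever the depth map says the subject is. A minimal sketch for a grayscale frame, assuming a metric depth map and an arbitrary subject-distance threshold (Samsung's actual pipeline is not public):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image: np.ndarray, depth: np.ndarray,
               subject_depth_max: float = 1.5) -> np.ndarray:
    """Blur everything the depth map marks as background.

    image: HxW grayscale frame; depth: HxW distances in metres.
    Pixels closer than subject_depth_max count as the subject
    and stay sharp; everything else gets a Gaussian blur.
    """
    blurred = gaussian_filter(image, sigma=5)
    subject_mask = depth < subject_depth_max
    return np.where(subject_mask, image, blurred)

# Toy frame: a bright subject in the centre, distant background elsewhere.
img = np.zeros((6, 6)); img[2:4, 2:4] = 1.0
dep = np.full((6, 6), 5.0); dep[2:4, 2:4] = 1.0
print(fake_bokeh(img, dep))
```

A real implementation would vary the blur radius with distance rather than using a single threshold, which is roughly what an adjustable blur-intensity slider would control.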
You could argue that a monochrome sensor is better for depth sensing because it captures more light, but even lower-resolution secondary monochrome sensors are more expensive to procure, which is why they're usually reserved for top-of-the-line devices.
Ultra-wide camera
Finally, there's an 8MP (F/2.4) ultra-wide sensor with a 120-degree field of view, which is almost the same as the natural FOV of the human eye. So, on paper, the Galaxy A7's 8MP ultra-wide lens will let you snap a shot of roughly everything in your field of vision.
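For a sense of what 120 degrees buys you, the horizontal span a lens covers at a given distance follows from simple trigonometry. This quick calculation assumes the quoted 120 degrees is the horizontal FOV (vendors sometimes quote the diagonal instead) and compares it against a roughly 72-degree standard lens:

```python
import math

def coverage_width(distance_m: float, fov_deg: float) -> float:
    """Width of the scene captured at a given distance for a given FOV."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# Standing 3 m away, a 120-degree lens takes in about 10.4 m of the
# scene, versus about 4.4 m for a ~72-degree standard lens.
print(coverage_width(3, 120))  # ~10.39
print(coverage_width(3, 72))   # ~4.36
```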
The wide-angle lens comes into play when you want to capture a group shot or a landscape and want the frame to be as wide as possible to fit the whole scene. The F/2.4 lens has become a mainstay for wide-angle shots on upper mid-range devices with dual-camera setups, and it should serve Galaxy A7 users very well.
So there you have it. Based on everything Samsung has revealed so far, the Galaxy A7 might actually have a pretty good camera. Of course, there's a lot of testing still to be done, and we don't even have official camera samples yet, so for now these are just specs on paper. Stay tuned for more on this exciting new Samsung Galaxy A series phone.