The new Samsung Galaxy S7 image sensor explained

Samsung has long been a notable player in the mobile image sensor game, and the company is taking a fresh approach to photography with the introduction of its Galaxy S7 range of smartphones. The company is ditching the high-pixel-count 16-megapixel sensor from last year's flagships in favor of a lower-resolution 12-megapixel sensor that features larger individual pixels. So let's see why Samsung is making this change.

Why use bigger pixels?

We are regularly told, perhaps incorrectly, that smartphone image sensors are catching up with the functionality and quality offered by much more expensive equipment. The fact is, however, that the compact size of smartphone camera components forces designers to compromise. We've often seen engineers push up sensor resolutions in pursuit of extra detail, but this comes at the expense of color quality and low-light performance. Samsung seems to be tackling this particular trade-off head-on, cutting the megapixel count and opting for larger pixels.

As photography enthusiasts probably know, it's not the number of megapixels that matters most for capturing high-quality images. Extra pixels come in handy for cropping images at a later date, and around 6 megapixels is already enough for a detailed A4-sized print. What really matters for image quality is the size of the image sensor itself and the ability of its photosites (the sensor's pixels) to capture enough light.
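As a quick sanity check on that 6-megapixel figure, here's a minimal Python sketch; the dpi values are common print-quality rules of thumb rather than numbers from the article:

```python
# Rough check: how many megapixels does a print of a given size need?
def megapixels_for_print(width_in: float, height_in: float, dpi: int) -> float:
    """Pixels required to print at the given physical size and density."""
    return (width_in * dpi) * (height_in * dpi) / 1e6

# A4 paper is 210 x 297 mm, i.e. roughly 8.27 x 11.69 inches.
for dpi in (240, 300):
    print(f"A4 at {dpi} dpi needs ~{megapixels_for_print(8.27, 11.69, dpi):.1f} MP")

# Output:
# A4 at 240 dpi needs ~5.6 MP
# A4 at 300 dpi needs ~8.7 MP
```

So 6 megapixels lands comfortably in the "detailed print" range at around 250 dpi, consistent with the claim above.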

Tech note

High-end image sensors in smartphones typically measure 25mm² (1/2.5″) or less, making them noticeably smaller than the high-end image sensors found in digital SLR cameras.

Simply put, a larger image sensor allows for larger photosites, which means more light per pixel. Typically, this results in higher dynamic range, less noise, and better low-light performance than a smaller image sensor with too high a pixel count. Of course, compact smartphone form factors mean there will never be enough space to match DSLR sensor sizes, so compromises have to be made to strike the right balance between noise, resolution, and low-light performance. Rather than chasing extra resolution, Samsung seeks to increase image quality by maximizing the available space for each photosite.

Of course, CMOS image sensor design is a bit more complicated than that. The backplane electronics and the isolation between pixels can have a substantial impact on attributes such as noise due to crosstalk between pixels. The lens placed over the sensor and the image signal processor used to interpret the data are also important. Unfortunately, Samsung hasn't spilled enough beans for us to piece together everything that's going on inside its latest smartphone camera, but here's what we know.

Samsung Sensor Specifications

Samsung has increased its pixel size from 1.12μm in its Isocell sensors to 1.4μm in the Galaxy S7, allowing each photosite to capture additional light. This marks a 56 percent increase in pixel area over the Galaxy S6.
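That 56 percent figure follows directly from the change in pixel pitch, since a photosite's light-gathering area scales with the square of its side length:

```python
# Photosite area scales with the square of the pixel pitch.
old_pitch_um = 1.12  # Galaxy S6 pixel size
new_pitch_um = 1.40  # Galaxy S7 pixel size

area_gain = (new_pitch_um / old_pitch_um) ** 2
print(f"Pixel area increase: +{(area_gain - 1) * 100:.0f}%")  # +56%
```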

It's not as large as the 2.0μm pixels found in HTC's UltraPixel technology, and it's still slightly smaller than the 1.55μm photosites in the Nexus 6P's sensor, which has consistently been an excellent performer in our own tests. However, Samsung has also greatly increased the size of the aperture in the accompanying lens, so that extra light can flow to the sensor. The Samsung Galaxy S7's lens has an aperture of f/1.7, up from the f/1.9 aperture of the Galaxy S6's 16-megapixel camera, letting in up to 25 percent more light.

Combined, Samsung says this allows the Galaxy S7 and S7 Edge's camera to capture 95 percent more light than its predecessor, which should greatly improve low-light performance and help reduce image noise, a common problem with small smartphone sensors.
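The two gains multiply, and a quick check of the arithmetic shows they line up almost exactly with Samsung's 95 percent claim; light throughput scales with the inverse square of the f-number:

```python
# Total light gain = pixel area gain x aperture gain (inverse-square of f-number).
pixel_gain = (1.40 / 1.12) ** 2   # larger photosites: ~1.56x
aperture_gain = (1.9 / 1.7) ** 2  # f/1.9 -> f/1.7:     ~1.25x

total_gain = pixel_gain * aperture_gain
print(f"Aperture gain: +{(aperture_gain - 1) * 100:.0f}%")  # +25%
print(f"Combined gain: +{(total_gain - 1) * 100:.0f}%")     # +95%
```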

The future of fast focus

The scope of Samsung's changes with its new image sensor doesn't stop at capturing light. The company is also the first to implement dual-pixel on-chip phase detection on every pixel of its sensor.

This phase-detection autofocus technology has been used in some DSLR camera sensors, and it works by detecting the phase of light received at two separate pixel locations. This information can then be used to focus on a specific object or part of the frame, in a way that is not too different from how the human eye perceives depth.
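To make the idea concrete, here's a toy Python sketch (a simplified illustration, not Samsung's actual algorithm): the two halves of a dual pixel view the scene through opposite sides of the lens, so a defocused subject appears shifted between the two signals, and finding the shift that best aligns them tells the camera which way, and roughly how far, to drive the lens:

```python
import numpy as np

def phase_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Find the shift that best aligns the two sub-pixel signals.

    A shift of 0 means the subject is in focus; the sign and size of the
    shift indicate the direction and amount of lens travel (1-D toy model).
    """
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        score = np.dot(a - a.mean(), b - b.mean())  # unnormalized correlation
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# A defocused edge lands at different positions in the two views:
x = np.arange(40)
left = (x >= 23).astype(float)   # edge at position 23 in the left view
right = (x >= 17).astype(float)  # same edge at position 17 in the right view
print(phase_shift(left, right))  # 6 -> out of focus; drive the lens to close the gap
```

Because every pixel on Samsung's sensor contributes a left/right pair like this, the focus estimate is available across the whole frame at once rather than at a handful of dedicated points.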

Other image sensors, such as Sony's high-end Exmor RS models, implement a small number of phase-detection diodes across the sensor, but these typically account for only about one percent of the pixels. Samsung is the first company to implement phase detection on every pixel of its sensor. The main advantage is that focusing can be achieved much faster than before, and focus time no longer depends on the subject happening to fall on the few specially placed pixels scattered across the sensor. In a demonstration video, Samsung showed how much faster the Galaxy S7 (right) can focus compared to the Galaxy S6 (left).

The Samsung Galaxy S7 certainly takes a rather different approach to photography than the company's previous flagship models, and on paper the theory seems sound for genuinely improving image quality. Still, it's the final image quality that matters most, and a good image isn't just about the sensor. We'll be putting the handset's camera through plenty of tests when it comes time for a full review.
