How a digital sensor “sees” the world.
Just a few facts about digital photography and the magic behind it.

In today's world of advanced technology, sensors have become an integral part of our daily lives, in applications ranging from smartphones to industrial machinery. One of their most fascinating abilities is identifying color, and in this article we will look at the raw capabilities of imaging sensors and at how far they get on their own, without any external assistance.

When it comes to identifying color, sensors work on the principle of light reflection and absorption. Every object reflects and absorbs light in its own way, and a sensor captures these differences by measuring the intensity of the light reaching it and comparing that against a set of reference color values. Sensors can distinguish a wide range of colors, from the basic primaries to complex shades and hues, and they can also discriminate between levels of brightness and saturation, which is crucial in applications such as image processing and color correction. With the help of machine-learning algorithms, they can even be trained to recognize new colors and patterns, making them more versatile and efficient still.

In short, the color-identifying capabilities of sensors are remarkable. By understanding their raw abilities, we can gain a deeper appreciation for the technology that surrounds us, whether in our smartphones, our cameras, or industrial machinery.
In this discussion I'd like to delve into sensor color and what sensors can do before the characterization phase. Instead of focusing on profiles, let's explore the raw color-identifying abilities of these sensors and see how they behave on their own. Let's get started!
In these tests, the development chain is deliberately interrupted right after Linearization -> White balance.
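To make that concrete, here is a minimal numpy sketch of what stopping the chain at that point means; the black level, white level, and white-balance gains are placeholder numbers, not the actual values of either camera, and no color matrix, profile, or tone curve is applied afterwards.

```python
import numpy as np

def linearize_and_white_balance(raw_rgb, black_level, white_level, wb_gains):
    """Linearize demosaiced raw data, apply white balance, then stop.

    raw_rgb:     (H, W, 3) demosaiced sensor values, in raw digital numbers
    black_level: sensor black offset (placeholder value below)
    white_level: sensor clipping point (placeholder value below)
    wb_gains:    per-channel multipliers, usually normalized so green = 1.0
    """
    # Linearization: remove the black offset and rescale to 0..1
    linear = (raw_rgb.astype(np.float64) - black_level) / (white_level - black_level)
    linear = np.clip(linear, 0.0, 1.0)

    # White balance: per-channel gains only -- the development chain is
    # deliberately interrupted here, before any color matrix or profile.
    return linear * np.asarray(wb_gains, dtype=np.float64)

# Example with made-up numbers (not the real parameters of either camera)
demo_raw = np.random.randint(600, 16000, size=(4, 4, 3))
camera_rgb = linearize_and_white_balance(demo_raw, black_level=512,
                                         white_level=16383,
                                         wb_gains=(2.1, 1.0, 1.5))
```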
The methodology:
The test sample is a multispectral image of the ColorChecker SG, mathematically developed through the spectral sensitivity function (SSF) curves of a Canon 5D Mk II and a Sony A7R II, plus a reference computed under the 2010 10° standard observer, the best we have today for defining color. The resulting images are, for convenience, brought back into ProPhoto space and integrated.
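The "mathematical development" itself boils down to a numerical integration of each pixel's spectrum against a set of sensitivity curves, weighted by the illuminant. A rough sketch of that step, assuming the camera SSFs, the 10° observer curves, and the illuminant SPDs have already been resampled onto the same 10 nm grid (all array names here are illustrative):

```python
import numpy as np

# Wavelength grid of the multispectral image: 380..730 nm in 10 nm steps (36 bands)
wavelengths = np.arange(380, 740, 10)

def integrate_spectra(reflectance, illuminant, sensitivities):
    """Integrate per-pixel spectra against a set of sensitivity curves.

    reflectance:   (H, W, 36) spectral image, one plane per wavelength band
    illuminant:    (36,) spectral power distribution (D65, StdA, F2, ...)
    sensitivities: (36, 3) camera SSFs, or observer CMFs for the reference
    returns:       (H, W, 3) camera raw RGB, or reference XYZ (unnormalized)
    """
    radiance = reflectance * illuminant                    # light reaching the sensor
    return np.einsum('hwl,lk->hwk', radiance, sensitivities)

# With hypothetical data loaded elsewhere:
#   camera_rgb = integrate_spectra(sg_cube, d65_spd, canon_ssf)   # what the sensor "sees"
#   ref_xyz    = integrate_spectra(sg_cube, d65_spd, cmf_10deg)   # reference, 10° observer
```

The same call, fed with the Sony SSFs or with the observer's color-matching functions, produces the other renderings and the reference they are compared against.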
What is a multispectral image? In practice, it is a file, or a collection of files, that describes an image through the spectral data of the objects present in the scene.
So, instead of the usual RGB values, each pixel of the multispectral image carries spectral data, one measurement per wavelength band.
The multispectral SG image I built is made up of 36 monochrome TIFFs, one plane for every 10 nm of the spectrum, from 380 to 730 nm.
Something like this:
Except that each pixel value does not represent a “grey”, but a measurement of that slice of the spectrum.
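For the curious, assembling such a spectral cube is straightforward; here is a sketch that assumes the 36 planes are stored as individual TIFFs under a hypothetical naming scheme:

```python
import numpy as np
import tifffile  # pip install tifffile

# Hypothetical naming: one monochrome TIFF per 10 nm band, from 380 to 730 nm
bands = range(380, 740, 10)
planes = [tifffile.imread(f"sg_{nm}nm.tif").astype(np.float64) for nm in bands]

# Stack into an (H, W, 36) cube: the last axis is wavelength, not color
sg_cube = np.stack(planes, axis=-1)
print(sg_cube.shape)  # e.g. (height, width, 36)
```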
And now we come to the comparisons:
For D65:
For StdA:
For F2:
Looking at the results, these two sensors bear a striking resemblance to each other, perhaps more than anticipated. Although they were released years apart and built around different design philosophies, they serve the same purpose, and the “renowned color science” turns out to be remarkably consistent across models.
But who does it better in the end? Who is closest before the characterization phase?
Canon, in order from left to right D65, StdA, F2:


Sony, in order from left to right D65, StdA, F2:


When comparing these three illuminants, it is evident that the Sony sensor exhibits superior raw performance. However, given the time gap between the two models, this is not surprising. It may be worthwhile to expand the test set to include other illuminants, as this could potentially allow the Canon to redeem itself (although I am skeptical).
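The article does not spell out the metric behind these per-patch comparisons; a common choice would be a plain CIE76 ΔE computed in Lab against the reference patches. A sketch under that assumption, with the patch values and the white point taken from the previous steps:

```python
import numpy as np

def xyz_to_lab(xyz, white_xyz):
    """CIE XYZ -> CIELAB, relative to the given white point."""
    t = xyz / white_xyz
    delta = 6.0 / 29.0
    f = np.where(t > delta**3, np.cbrt(t), t / (3.0 * delta**2) + 4.0 / 29.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab."""
    return np.linalg.norm(lab1 - lab2, axis=-1)

# Hypothetical per-patch XYZ values for the 140 SG patches:
#   camera_xyz = patches produced by the sensor chain, ref_xyz = 10° observer reference
#   de = delta_e76(xyz_to_lab(camera_xyz, white_xyz), xyz_to_lab(ref_xyz, white_xyz))
#   print(de.mean(), de.max())   # average and worst-case error per sensor and illuminant
```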
In summary, everyone doing sensor engineering is up against the same physics and the same color science, and there is limited room for improvement on that front. Both sensors do an admirable job of handing the raw converter information that is already approximately accurate. Ideally, it is the profile's job to close the remaining gap between the raw data and the actual reference; unfortunately, that crucial step is frequently, and deliberately, neglected, as has been discussed at length.