How a digital sensor “sees” the world.
A few facts about digital photography and the magic behind it.

I thought I’d talk about sensor colour for a moment, but going beyond the characterization phase.
So nothing about profiles this time: just the sensors, with their raw colour-identifying abilities.
Enough to satisfy the curiosity of how sensors manage on their own.
Ideally, in these tests the development chain is interrupted right after linearization -> white balance.
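To make that cut-off point concrete, here is a minimal sketch of a white-balance step on already-linearized raw data. All the numbers are invented for illustration; this is not either camera's actual pipeline:

```python
import numpy as np

# Hypothetical linearized raw RGB data: two pixels, R/G/B. Values are made up.
raw = np.array([[0.40, 0.52, 0.30],
                [0.10, 0.13, 0.08]])

# Raw response to a neutral (grey) patch under the shooting illuminant.
neutral = np.array([0.40, 0.52, 0.30])

# White balance: scale each channel so the neutral patch comes out
# equal in R, G and B (here normalised to the green channel).
wb_gains = neutral[1] / neutral
balanced = raw * wb_gains

print(balanced[0])  # the neutral pixel becomes [0.52, 0.52, 0.52]
```

Everything after this step (matrixing, profiles, tone curves) is deliberately left out of the test.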
The methodology: the test sample is a multispectral image of the ColorChecker SG, mathematically developed through the SSF curves of a Canon 5D Mk II and a Sony A7R II, plus a reference under the 2010 10° standard observer, the best definition of colour we have today.
The resulting images are, for convenience, brought into a ProPhoto space and assembled.
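The "mathematical development" can be sketched like this: a channel's raw value is the patch's spectral data, weighted by the illuminant and by that channel's spectral sensitivity function (SSF), summed over wavelength. All the curves below are made-up placeholders, not the real Canon or Sony data:

```python
import numpy as np

wl = np.arange(380, 740, 10)                 # 36 bands, 380-730 nm

# Placeholder curves for illustration only.
illuminant = np.ones_like(wl, dtype=float)   # flat, equal-energy illuminant
reflectance = np.linspace(0.2, 0.8, wl.size) # one patch, rising toward red

def gauss(center, width):
    """Toy Gaussian stand-in for a channel's spectral sensitivity."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

ssf = np.stack([gauss(600, 40), gauss(540, 40), gauss(460, 40)])  # R, G, B

# Raw RGB: discrete integration over the 10 nm bands.
raw_rgb = (ssf * illuminant * reflectance).sum(axis=1) * 10
```

The same sum with the observer's colour-matching functions in place of the SSFs yields the reference values.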
What is a multispectral image? In practice, it is a file, or a collection of files, that defines an image through the spectral data of the objects in the scene.
So instead of the usual RGB data, each pixel of a multispectral image carries spectral reflectance data.
The multispectral SG image I built is made up of 36 monochrome TIFFs, one for every 10 nm of the spectrum, from 380 to 730 nm.
Something like this:
Except that the single pixel value does not represent a “grey”, but a measurement of that spectral section.
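In code terms, the 36 TIFFs form a cube of planes, and each pixel is a spectrum rather than a grey level. A tiny synthetic stand-in (random data, not the real SG capture) shows the structure:

```python
import numpy as np

# One plane per 10 nm band, as in the 36 monochrome TIFFs described above.
wavelengths = np.arange(380, 740, 10)        # 380, 390, ..., 730 nm
height, width = 4, 6                         # tiny stand-in for the real frame
cube = np.random.default_rng(0).random((wavelengths.size, height, width))

# A pixel is a spectrum: one measurement per 10 nm section.
spectrum = cube[:, 2, 3]                     # 36 values for pixel (2, 3)

# A single plane is the whole scene as seen in one band.
band_550 = cube[wavelengths == 550][0]       # the 550 nm plane
```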
And now we come to the comparisons:
For D65:
For StdA:
For F2:
As you can see, they are very similar; perhaps more than one might expect. Despite being sensors released at different times and with different design philosophies, their task is the same. And the “famous colour science” is always the same for everyone.
But who does it better in the end? Who comes closest before the characterization phase?
Canon, in order from left to right D65, StdA, F2:


Sony, in order from left to right D65, StdA, F2:


Across these three illuminants, the Sony sensor has the better raw performance. But considering the time between these two models, I don't think there is anything strange about that.
We could also extend the test to other illuminants and perhaps see Canon take its revenge (I doubt it).
To conclude: the problems are the same for everyone, and sensor engineering always has to deal with the physics and science of colour; there is little to be done about it.
Both sensors do an excellent job of providing the raw converter with information that is already approximately correct. Ideally, it would be up to the profile to bridge the gap between the raw data and the true reference, and this, as already widely debated, is something all too often avoided entirely.