CCD vs CMOS

A technical dissertation on the scene-referred reconstruction capacity of CCD and CMOS technology

As usual…
Please take a seat, have a scone and relax; at the end, you will have a deeper knowledge of the output of your cameras.

In this article, we will try to address one of the most debated topics in photography, especially on forums: the claim that CCD colours are better than CMOS colours.
Over time, this assumption has gained the status of an axiom, becoming a sort of self-evident and indisputable truth, despite the complete lack of a valid technical argument and experimental examination.

To begin our analysis correctly, we must first correct the indicted sentence from a lexical point of view. Defining something as “better” presupposes a quality standard, but here the judgment is simply derived from observing the photographs as they come out of the cameras or of their respective raw converters; so the correct term should be “pleasant”. Many therefore argue that photos taken with CCD cameras offer more pleasant colours, and credit this to the technological difference from the latest CMOS sensors.
But is this difference really attributable to the difference between a CCD and a CMOS?
In short, no. But explaining why will be a little more complex.

The basic task of a digital sensor

Both CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor) technologies are based on the photoelectric effect, in which a photon incident on an atom of a metal or semimetal causes the expulsion of an electron.

 

Both technologies employ suitably refined and doped silicon (Si) to be a semiconductor suitable for the construction of sensors:


In practice, this involves converting the incident photons into electrons that can be collected to form an electric charge proportional to the intensity of the exposure.
Any sensor based on this physical effect behaves in an ideally linear way, where a doubling of the incident photons corresponds to a doubling of the collected electric charge, which is subsequently converted into a digital value by the A/D circuit.
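As a minimal numeric sketch of this ideal linear behaviour (the quantum efficiency, full-well capacity and A/D resolution below are invented for illustration, not taken from any real camera):

```python
# Illustrative model of an ideally linear sensor: photons -> electrons -> DN.
# All parameter values here are assumptions chosen for the example.
def expose(photons, qe=0.5, full_well=40000, bits=14):
    """Convert an incident photon count to a quantized digital number.

    qe        : quantum efficiency (fraction of photons that free an electron)
    full_well : electron capacity of the photosite before clipping
    bits      : resolution of the A/D converter
    """
    electrons = min(photons * qe, full_well)      # linear up to saturation
    max_dn = 2 ** bits - 1
    return round(electrons / full_well * max_dn)  # linear A/D conversion

# Doubling the photons doubles the digital value (until the well saturates):
low = expose(10000)
high = expose(20000)
```

Past the full-well capacity the response clips, which is where the ideal linearity of the model ends in any real photosite.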
A sensor constructed in this way, in its simplest form, is however sensitive beyond the spectrum of light visible to the human eye.
Here is a typical curve:

As we can see, it extends well beyond the canonical 380–730 nm range.

Furthermore, there is no chromatic discrimination: what we get is only a value of the intensity of the electric charge.
If our aim is to reproduce reality as we see it through our senses, it is necessary to limit the response range of the sensor by introducing NUV (near-ultraviolet) and NIR (near-infrared) filters.
Below them sits a matrix of colour filters, typically in a Bayer scheme:
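As a small sketch of how such a mosaic is laid out, the common RGGB tiling can be indexed as follows (the RGGB order is an assumption for the example; actual layouts vary by camera):

```python
# Illustrative RGGB Bayer layout: each photosite records only one channel;
# the other two must later be interpolated (demosaicing).
def bayer_channel(row, col):
    """Return the colour filter covering the photosite at (row, col),
    assuming a 2x2 RGGB tile."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# The first two rows of the mosaic:
tile = [[bayer_channel(r, c) for c in range(4)] for r in range(2)]
# tile == [['R', 'G', 'R', 'G'], ['G', 'B', 'G', 'B']]
```

Note that green is sampled twice per tile, roughly mirroring the eye's greater sensitivity in the middle of the spectrum.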

Spectral Sensitivity Functions (SSFs):

Now that the sensor has been limited in the visible range and equipped with a colour matrix, we have three distinct RGB curves.
Ideally, these curves should be superimposable on those of the CIE standard observer:

 

In this case, we see the CIE 1931 2° standard observer curves, the first proposed by the Commission Internationale de l’Eclairage in 1931.
If the sensor curves matched, the camera would see exactly as we do, making the characterization phase superfluous. However, it is not possible to obtain SSFs identical to the standard observer.
An example of the real SSFs of a camera:

As a result, a camera does not, by itself, return realistic colours. The raw data needs a characterization, the role played by the camera profile.
Each sensor, CCD or CMOS, has its own specific SSF curves, which result from the combination of all the layers: the NIR and NUV filters, the colour filter matrix and the native sensitivity of the silicon, which depends on the production process. The variability between sensors can be slight or very significant, but already at this stage it is practically impossible to attribute it to the difference between CCD and CMOS.
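In its simplest form, such a characterization can be sketched as a 3x3 matrix mapping linear raw camera RGB to CIE XYZ. The matrix values below are invented purely for illustration; real profiles are fitted per illuminant and are usually more elaborate than a single matrix:

```python
# Hypothetical 3x3 characterization matrix (invented values, not from any
# real camera profile): maps a linear raw RGB triplet to CIE XYZ.
M = [
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.0, 0.1, 0.9],
]

def camera_to_xyz(rgb, matrix=M):
    """Apply the profile matrix to a linear raw RGB triplet."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in matrix)

# Example: one raw triplet pushed through the profile matrix.
xyz = camera_to_xyz((0.5, 0.4, 0.2))
```

In practice the matrix coefficients are found by least-squares fitting over many measured patches, which is exactly why the quality of the characterization, not the sensor technology, dominates the final colour rendering.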
Can a CCD sensor have more efficient SSFs than a CMOS and offer a better raw signal, containing more information about colours?
This question is much more pertinent than the one usually posed in the forums, but the answer is not simple and cannot be unique. There may be CCD sensors with more efficient SSFs than some CMOS sensors, but also the opposite. What makes one set of SSFs better than another?

The signal separation capability

As we have seen, no camera can skip the characterization phase, because no camera sees exactly as we do; obviously a rose remains red, the grass green and the sky blue, but the colours are not accurate. In fact, no manufacturer designs their sensors trying to bring the raw colour rendering close to reality; they prefer to pursue the maximum possible separation capability.
This capability can be described as the sensitivity of the sensor in registering the difference between two spectral inputs that would produce two very close XYZ triplets in the tristimulus space defined by the standard observer.
Basically, it is the ability to distinguish two objects whose colours are extremely similar to the human eye; this provides the information needed in the raw data to extrapolate the right reproduction of reality.
An efficient sensor seeks to maximize this capability over a large area of the human locus and for a variety of illuminants, of which sunlight is (usually) the main one.
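The idea can be sketched numerically with invented Gaussian SSFs (the peaks and widths below are assumptions, not measured from any camera): two reflectance spectra that look almost identical should still produce measurably different raw triplets if the SSFs separate the signal well.

```python
import math

# Invented Gaussian spectral sensitivities for an illustrative sensor.
def ssf(peak, wl, width=40.0):
    """Gaussian spectral sensitivity centred on `peak` (nm)."""
    return math.exp(-(((wl - peak) / width) ** 2))

WAVELENGTHS = range(400, 701, 10)
PEAKS = (600, 540, 460)  # assumed R, G, B peak wavelengths (nm)

def raw_triplet(spectrum):
    """Integrate a reflectance spectrum against the three SSFs."""
    return tuple(
        sum(spectrum(wl) * ssf(peak, wl) for wl in WAVELENGTHS)
        for peak in PEAKS
    )

# A flat grey, and the same grey with a tiny bump around 570 nm:
flat = lambda wl: 0.50
bump = lambda wl: 0.50 + (0.02 if 560 <= wl <= 580 else 0.0)

# Per-channel raw differences: the larger they are, the easier it is for
# the profile to tell the two colours apart.
d = [abs(a - b) for a, b in zip(raw_triplet(flat), raw_triplet(bump))]
```

With these assumed curves the bump registers strongly in the red and green channels and barely at all in blue; a sensor whose differences fall below the noise or quantization floor simply cannot separate the two colours, no matter how the profile is built.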

 

Our Test

After these premises, we come to the most interesting part: comparing two cameras experimentally, one CCD and one CMOS.
We chose the Nikon D200 and Nikon D700, CCD and CMOS respectively, because they come from the same manufacturer and are sufficiently close in time. In this way, we try to isolate the CCD vs CMOS variable as much as possible.
All comparison tests are conducted on the spectral models of the cameras to rule out any possible disturbance.

-SSF curves:
First, let’s compare the SSFs of the two cameras the test is based on:

 

There are design similarities, but also significant differences.

-Rendering of a virtual SG target:
Virtually exposing the multi-spectral image of a ColorChecker SG through their respective SSFs, we get for D65 (Daylight 6504K):

Under this illuminant, the cameras offer very similar output with minimal differences.
The sigma stands at 0.50 with a maximum error of 2.38 DeltaE 2000.

For StdA (Tungsten Bulb 2856K):

Even under artificial tungsten light, the raw output of the cameras is similar, with a sigma of 0.62 and a maximum error of 2.74 DeltaE 2000.
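The two statistics quoted above, sigma and maximum error, are simple to compute once the per-patch errors are known. As a simplified illustration we use the older CIE76 formula (plain Euclidean distance in Lab) instead of the full DeltaE 2000, and invented patch values rather than the measured D200/D700 data:

```python
import math
import statistics

# Simplified stand-in for DeltaE 2000: the CIE76 colour difference is the
# Euclidean distance between two Lab triplets.
def delta_e76(lab1, lab2):
    return math.dist(lab1, lab2)

# Invented reference and rendered Lab values for three example patches:
reference = [(50.0, 10.0, 10.0), (70.0, -20.0, 30.0), (30.0, 40.0, -10.0)]
rendered  = [(50.5, 10.2,  9.8), (69.0, -19.0, 31.5), (30.2, 41.0,  -9.5)]

errors = [delta_e76(a, b) for a, b in zip(reference, rendered)]
sigma = statistics.pstdev(errors)  # spread of the per-patch errors
worst = max(errors)                # maximum error across the patches
```

The real test uses DeltaE 2000, which weights lightness, chroma and hue differences perceptually; the statistics themselves are computed the same way.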

-Separation performance:
To analyze the signal separation capabilities of the two cameras, we use a synthetic spectral target covering almost the entire human locus; only a small section close to the purple axis is ignored, as the signal from any camera would be too low.

There are more than ten thousand unique spectral samples of the reflectance class, and therefore dependent on the illuminant that we choose. Here in D50:

The heat-map in the u’v’ diagram represents the intensity of variation of the output in 16-bit encoding for each sample, for a variation of 1 dCh (Delta Chromaticity), from zero (no separation capacity) to 300 (maximum separation capacity). The graph shows the AdobeRGB gamut (red triangle) and the Pointer gamut (irregular perimeter) as references. The Pointer gamut contains all real objects observable in reflection.
As we can see, the performances are very similar and completely equivalent on a practical level. The design similarity of the two sensors is evident; although they belong to two different technologies, the test shows a comparable result under the D50 illuminant.
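For reference, the u'v' coordinates in which the heat-map is plotted come from the standard CIE 1976 UCS conversion of XYZ tristimulus values:

```python
# Standard CIE 1976 UCS chromaticity conversion from XYZ tristimulus values.
def xyz_to_uv(x, y, z):
    """Return the (u', v') chromaticity of an XYZ triplet."""
    denom = x + 15.0 * y + 3.0 * z
    return 4.0 * x / denom, 9.0 * y / denom

# Example: the chromaticity of the equal-energy white point (X = Y = Z).
u, v = xyz_to_uv(1.0, 1.0, 1.0)
```

The u'v' diagram is used for this kind of plot because, unlike the xy diagram, it is approximately perceptually uniform, so equal distances correspond roughly to equal perceived chromaticity differences.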
For StdA we get:

Under tungsten light there are greater differences, but it would be difficult to assess which of the two sensors does better. Once again the performances are superimposable and the practical differences are null. We can therefore conclude that the difference between CCD and CMOS does not involve substantial changes to the output. Other variables lead to a difference in rendering between the photographs of two cameras, whether one is CCD and the other CMOS, or both belong to the same technology.

Final test:
For further proof of our conclusions, we will take a multispectral image of a scene measured in the laboratory and show the results of the scene-referred reconstruction with Cobalt profiling. The scene is defined from 400 to 700 nm in 10 nm steps.

Calculation of the image under D65 for standard observer 1931 2°:

Rendering D200 with a profile for D65:

Rendering D700 with a profile for D65:

Calculation of the image under StdA for standard observer 1931 2°:

Rendering D200 with a profile for StdA:

Rendering D700 with a profile for StdA:

Final Conclusions:

The incidence of the CCD vs CMOS variable is not quantifiable when comparing designs of equal quality. The differences in signal separation capability depend on the overall quality of the sensor, in which the choice between CCD and CMOS technology is only one of many elements.
The cameras we reviewed have similar hardware performance with regard to colour discernment, and the differences are only detectable under laboratory conditions.
In the normal experience of use, the difference between the look of a CCD and a CMOS image is attributable to other factors:

– When the camera was released
– The characterization of the sensor
– The technology of the camera profile
– The colour correction added to the profile itself

As we have already seen in the tutorial about Adobe profiling, the characterization of the raw data takes the lion’s share in generating differences between cameras. But if, as evidenced in the final test, the cameras are properly characterized and profiled with the same accuracy and technology, then it will be practically impossible to perceive differences in the scene-referred reconstruction of the photographed scene.
