
esolution multispectral images, and outline the methodology employed to operate it.

2. ‘Narrow-band’ multispectral imaging

Compared to RGB imaging, which is based on the theoretical framework of colorimetry [7] and therefore ‘synthesizes’ color stimuli from the contributions of objects, environment, and observer, multispectral imaging attempts to estimate the objects’ reflectances. It is therefore unaffected by the typical problems of RGB imaging, including device dependency [1], metamerism [7,1], and the accuracy limitations of device sensors [8,9]. In fact, despite the availability of colorimetric device-independent color spaces and international standards such as ICC profiles [10] and sRGB [11], multispectral imaging remains the only way to achieve complete independence from both the environment and the observer.

Motivations for the use of multispectral imaging can be found in everyday experiences such as the phenomenon of metamerism, which shows that different ‘physical’ colors (spectra) can receive the same colorimetric representation. At the same time, physics shows that while color representations in RGB imaging and colorimetry use parameters whose ultimate physical significance is that of measuring the amount of light energy ‘registered’ by the sensors considered (both human and electronic), such parameters bear only an indirect relationship to the fact that the observed objects are actually able to reflect light towards those sensors.

The aim of multispectral imaging is therefore to describe this ‘ability’ of colored surfaces as modelled by their reflectance function; since this function depends on the physical properties of the surfaces considered, it is far less variable than environmental conditions and observer sensitivities, and therefore more ‘fundamental’.

In general, the acquisition performed using a given sensor will return a value a in the form

a = \int_{\lambda_1}^{\lambda_2} E(\lambda)\, R(\lambda)\, S(\lambda)\, d\lambda .   (1)

This value integrates the contributions of the energy E that reaches the physical sample observed, the color reflectance R of the sample, and the ‘sensitivity’ S of the sensor. The integration with respect to the wavelength λ is performed over the range λ1 to λ2 of the sensor’s sensitivity; if this range exceeds that of the visible spectrum, appropriate steps must be taken to cut off the unwanted radiation.

To obtain an estimate of the reflectance R, two different approaches are currently used in multispectral imaging [12]. On one hand, direct measurements of these values can be attempted if the device’s sensors are sensitive to a very narrow wavelength
