
Usability of Digital Cameras for Verifying Physically Based ...


7.4 Treating the Camera as a Black Box

The basic goal of color management is to reproduce color as it is seen by a human observer, for example by taking a photograph and displaying it on a screen in such a way that the colors match the original scene. Our problem differs from that of color management, though. As stated in section 1, we primarily want to verify the light transport simulation, not necessarily the subsequent steps of image synthesis (such as the tone mapping step). First, we have to prove that our physically based rendering algorithm is implemented correctly. Only then can we verify the tone mapping and gamut mapping algorithms. Otherwise we would analyze two different steps of the rendering pipeline at once and, if errors occurred, we could not say whether the tone mapping step is the cause of the error or the rendering algorithm itself is not performing correctly. One possible solution to this problem is not to convert to the XYZ or L*a*b* color space at all, but to do the comparison in RGB space.
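Such a comparison in RGB space can be sketched as follows. The arrays `rendered` and `photo` are placeholder assumptions standing in for a rendering mapped to camera RGB and a RAW photo of the same scene; only the error metrics are the point here.

```python
import numpy as np

# Placeholder data: a rendering mapped to camera RGB and a RAW photo of
# the same scene. In practice both would come from the pipeline described
# in the text; here they are synthetic (4x4 pixels, 3 channels).
rng = np.random.default_rng(1)
rendered = rng.random((4, 4, 3))
photo = rendered + rng.normal(scale=0.01, size=(4, 4, 3))  # small simulated noise

# Per-pixel error computed directly in RGB space, without converting to
# XYZ or L*a*b*: any deviation then points at the light transport
# simulation rather than at tone mapping or gamut mapping.
diff = np.abs(rendered - photo)
print("max per-channel error:", diff.max())
print("RMSE:", np.sqrt(np.mean(diff ** 2)))
```

Because no color space conversion is involved, the error measured this way is attributable to the rendering step alone.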

After light hits the CCD, various processing steps take place in the camera. Some of them, such as the interaction of light and the CCD elements, can be simulated, but others, such as color correction, cannot. Manufacturers usually do not publish details about the internal processes of their cameras. We do not have to care about steps like color correction if we use the RAW file format. But we still have to know the camera's spectral sensitivities and probably some details about the behaviour of the CCD element itself to be able to convert a rendering to a RAW file that is identical to a RAW photo of the same scene.

If we lack information about the camera's spectral sensitivities, the following method can be used. As we do not know exactly what happens inside the camera, we treat it as a black box. The input of our black box is light, i.e. light spectra. The output is of type RGB, i.e. the RGB values of the final image. What we want to find is a mapping that simulates, for one pixel, what happens inside the black box.
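One way to obtain such a mapping is to assume it is linear in the sampled spectrum and estimate it from calibration pairs of known spectra and observed RAW RGB values. The following is a minimal sketch under that assumption; the patch count, the wavelength sampling, and the calibration data itself are all illustrative, and the "camera" is simulated by a linear model so that the fit can be checked.

```python
import numpy as np

# Hypothetical calibration setup: for a set of patches we know both the
# incident light spectrum (sampled at n_samples wavelengths) and the
# camera's RAW RGB response. Shapes and data are illustrative assumptions.
n_patches, n_samples = 24, 36          # e.g. a color chart, 380-730 nm in 10 nm steps
rng = np.random.default_rng(0)
spectra = rng.random((n_patches, n_samples))      # one sampled spectrum per patch
hidden_sensitivities = rng.random((n_samples, 3)) # unknown to us in practice
raw_rgb = spectra @ hidden_sensitivities          # camera output (here: simulated)

# Treat the camera as a black box: posit a linear mapping M from spectrum
# samples to RGB and estimate it by least squares from the calibration pairs.
M, *_ = np.linalg.lstsq(spectra, raw_rgb, rcond=None)

# The fitted mapping now predicts a RAW RGB value for any rendered spectrum.
predicted = spectra @ M
print(np.allclose(predicted, raw_rgb))  # True here: the simulated camera is exactly linear
```

A real camera's response is not perfectly linear in the spectrum, so in practice the fit would only approximate the black box; the residual of the least-squares fit indicates how well the linear assumption holds.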

As described in section 5, the result of the light transport simulation consists of one spectrum per pixel. If we apply the mapping to each of these spectra, we get an RGB value for each pixel. This procedure is illustrated by the dotted line in figure 26. For verification purposes, this RGB value is now directly comparable to the RAW RGB value of the corresponding pixel of an image produced by a camera. A difference between these two values can be caused either by an error

