Perceptual Coherence: Hearing and Seeing

$$\text{Information} = -\sum_i \Pr(x_i)\,\log_2 \Pr(x_i). \tag{3.3}$$

Therefore, events with low probabilities contribute little to the overall information in spite of their high surprise value. The maximum information and uncertainty occur when each stimulus has equal probability, and in that case the information reduces to $\log_2 N$, where $N$ is the number of stimuli. As the probability distribution becomes more unequal, the overall information progressively decreases. If one stimulus occurs all of the time, the information goes to 0. The measure of information is in bits when using $\log_2$; it is only a number, like a percentage, without a physical dimension.
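These properties can be checked with a minimal sketch in Python (the `entropy_bits` helper and the example distributions are illustrative choices, not from the text):

```python
import math

def entropy_bits(probs):
    """Shannon information in bits: -sum of p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equal probabilities: information is maximal and equals log2(N).
print(entropy_bits([0.25] * 4))            # 2.0 bits = log2(4)

# Unequal probabilities: information decreases.
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))  # about 1.36 bits

# One stimulus occurs all of the time: information goes to 0.
print(entropy_bits([1.0, 0.0, 0.0, 0.0]))  # 0.0 bits
```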

If we want to define the information for combinations of two variables $(x_i, y_j)$, then the information summed over all $x$s and $y$s becomes

$$\text{Information} = -\sum_i \sum_j \Pr(x_i, y_j)\,\log_2 \Pr(x_i, y_j). \tag{3.4}$$

If the two variables are independent, then


$$\text{Information} = -\sum_i \sum_j \Pr(x_i)\Pr(y_j)\,\log_2 \left[\Pr(x_i)\Pr(y_j)\right] \tag{3.5}$$

$$\text{Information} = -\sum_i \Pr(x_i)\,\log_2 \Pr(x_i) - \sum_j \Pr(y_j)\,\log_2 \Pr(y_j) \tag{3.6}$$

$$\text{Information} = I(x) + I(y). \tag{3.7}$$
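A quick numerical check of this additivity, assuming the joint distribution of independent variables is the product of the marginals (reusing the hypothetical `entropy_bits` helper from the sketch above):

```python
px = [0.5, 0.25, 0.25]   # marginal distribution of x
py = [0.8, 0.2]          # marginal distribution of y

# For independent variables, Pr(x_i, y_j) = Pr(x_i) * Pr(y_j).
joint = [p * q for p in px for q in py]

# Information of the joint distribution (equation 3.4)...
print(entropy_bits(joint))                  # 2.2219 bits
# ...equals the sum of the marginal informations (equation 3.7).
print(entropy_bits(px) + entropy_bits(py))  # 2.2219 bits
```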

But natural stimuli are never made up of independent (or random) variables, and natural messages are never composed of independent random sequences of elements. For example, the letter q is nearly always followed by the letter u. After identifying a q, knowing that u occurs next gives us hardly any information (or conversely, that letter position creates almost no uncertainty).

We measure the actual information in terms of conditional probabilities; that is, given the letter q, what is the probability of u in the following letter slot? The difference between the information based on independent variables and the information actually measured is the constraint or structure in the stimulus, or the mutual information between stimulus and response.
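A sketch of this difference for a toy two-letter alphabet (the bigram probabilities below are invented for illustration; `entropy_bits` is the helper defined earlier):

```python
# Hypothetical joint probabilities Pr(first, second) for the alphabet {q, u}:
# u nearly always follows q, mimicking the q-u dependency in English.
joint = {('q', 'u'): 0.24, ('q', 'q'): 0.01,
         ('u', 'q'): 0.40, ('u', 'u'): 0.35}

# Marginal distributions of the first and second letter positions.
p_first = {'q': 0.25, 'u': 0.75}
p_second = {'q': 0.41, 'u': 0.59}

# Information assuming independence (equation 3.7) minus the information
# actually measured (equation 3.4) is the constraint: the mutual information.
independent = entropy_bits(p_first.values()) + entropy_bits(p_second.values())
actual = entropy_bits(joint.values())
print(independent - actual)  # about 0.17 bits of structure
```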

There are several important points about this measure of information:

1. The information measure can be generalized to variables that have continuous distributions, which often involves integrating over the range of the values. Moreover, the information measure can be further generalized to random functions such as sound pressure waveforms over time.

2. The information does not depend on the central tendency (i.e., mean) of the (stimulus or response) distribution. The information does depend on the number of possible states. If all states are equally probable, doubling the number of states increases the information by 1 bit (since $\log_2 2N = \log_2 2 + \log_2 N = 1 + \log_2 N$).

3. As Rieke et al. (1997) pointed out, the number of states for continuous variables is infinite. To make the measure valid for continuous variables, we need to create discrete bins in which we cumulate the values of the variable at discrete time points, as sketched below. Thus, our measurements will always
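A sketch of that binning step (the signal, bin count, and sample size are arbitrary illustrative choices; `entropy_bits` is the helper defined earlier):

```python
import math
import random

# Sample a continuous variable (a noisy sinusoid) at discrete time points.
random.seed(0)
samples = [math.sin(2 * math.pi * t / 100) + random.gauss(0, 0.1)
           for t in range(1000)]

# Cumulate the sampled values into a fixed number of discrete bins.
n_bins = 16
lo, hi = min(samples), max(samples)
counts = [0] * n_bins
for s in samples:
    i = min(int((s - lo) / (hi - lo) * n_bins), n_bins - 1)
    counts[i] += 1

# The binned distribution carries at most log2(16) = 4 bits; the
# estimate depends on the bin width we choose.
probs = [c / len(samples) for c in counts]
print(entropy_bits(probs))
```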
