Bernal S D_2010.pdf - University of Plymouth

6.1. ANALYSIS OF RESULTS

The graph in Figure 5.10 suggests that the optimum value for Kc1group is approximately 10, while the graph in Figure 5.11 suggests that the optimum value for Kc1group is approximately 15. If there are not enough C1 features per group, some input spatial arrangements of S1 nodes will not be captured, decreasing the categorization performance. Similarly, if there are too many C1 features per group, it is more likely that high values will be obtained in all the groups, thus reducing the informative value of the node. The number of features per group is therefore crucial for the feedforward recognition process and should provide a compromise between the two opposed effects described above.
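The saturation effect described above — too many features per group driving high values in every group — can be illustrated with a small numerical sketch. The group count, feature counts, and uniform activations here are hypothetical choices for illustration, not the model's actual parameters:

```python
import numpy as np

# Illustrative sketch only (hypothetical sizes): treat a C1-like node as the
# max over K feature activations within each group. As K grows, every group's
# max crowds toward the upper bound, so the spread of responses across groups
# shrinks and the node carries less discriminative information.
rng = np.random.default_rng(0)

def group_responses(k_per_group, n_groups=8):
    """Max-pool k random feature activations within each of n_groups groups."""
    activations = rng.random((n_groups, k_per_group))
    return activations.max(axis=1)

for k in (2, 10, 200):
    r = group_responses(k)
    # Spread across groups: large = informative node, near zero = saturated.
    print(f"K={k:3d}  mean={r.mean():.2f}  spread={r.max() - r.min():.2f}")
```

The effect is just the statistics of extremes: the max of K independent uniform activations has mean K/(K+1), so for large K all groups respond near 1 regardless of the input.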

Another factor that has proven crucial for successful categorization is the number of non-zero elements in the S2-C2 weight matrix, which can be considered equivalent to the sparseness of the matrix. There is evidence suggesting synaptic connectivity is sparse in feedforward cortical circuits and that firing patterns of cortical neurons exhibit sparse distributed representations, in which only a few of a large population of neurons are active (Quiroga et al. 2005, Murray and Kreutz-Delgado 2007, Karklin and Lewicki 2003, Olshausen 2003). Sparse coding strategies have proven to be essential to make efficient use of overcomplete representations, such as those found in V1, making it easier to find higher order correlations, increasing the signal-to-noise ratio and increasing the storage capacity of associative memories (Murray and Kreutz-Delgado 2007). Furthermore, they can improve pattern matching, since they lower the probability of false matches among elements of a pattern (Olshausen 2003).
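The selectivity benefit of sparse connectivity can be sketched numerically. In this hypothetical toy (binary weights and max-pooling from S2-like to C2-like units; the layer sizes and connection fractions are invented, not the thesis's), dense connectivity makes every C2 unit pool nearly the whole S2 layer, so two different objects yield almost identical C2 patterns, while sparse connectivity preserves object-specific differences:

```python
import numpy as np

# Sketch with hypothetical sizes: each C2-like unit max-pools over the
# S2-like units it is connected to. Dense connectivity saturates every C2
# unit toward the global maximum, collapsing the distance between the C2
# patterns of two different objects (i.e. more false matches).
rng = np.random.default_rng(1)
n_s2, n_c2 = 256, 32

def c2_pattern(s2, weights):
    """Max-pool each C2 unit over its connected (non-zero weight) S2 units."""
    masked = np.where(weights > 0, s2[None, :], -np.inf)
    return masked.max(axis=1)

def random_weights(frac_nonzero):
    """Random binary connectivity matrix with the given fraction of non-zeros."""
    return (rng.random((n_c2, n_s2)) < frac_nonzero).astype(float)

obj_a, obj_b = rng.random(n_s2), rng.random(n_s2)
for frac in (0.05, 0.5):
    w = random_weights(frac)
    a, b = c2_pattern(obj_a, w), c2_pattern(obj_b, w)
    # Distance between the two objects' C2 patterns: larger = more selective.
    print(f"{frac:4.0%} connections -> pattern distance {np.abs(a - b).mean():.3f}")
```

With 5% connectivity each C2 unit pools roughly a dozen S2 units, so its response still depends on which object is shown; at 50% every unit's max is near the global maximum for any input, which mirrors the false-positive argument in the text.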

The model results shown in Figures 5.12, 5.13, 5.14 and 5.15 indicate that sparse S2-C2 weight matrices, with < 10% of active connections, improve feedforward categorization. An example of one such sparse connectivity matrix is shown in Figure 4.11. As expected, the optimum number of non-zero elements is proportional to the S2 RF size. For S2 RF size=4x4, the optimum value of non-zero elements is one, while for higher S2 RF sizes the value lies between four and eight. As previously stated, sparse coding strategies account for this phenomenon, as more sparse S2-C2 connections make it less likely for two different objects to yield the same C2 response pattern (false positive), thus increasing selectivity. However, when the number of non-zero elements is too low, the distorted versions of the same object might be categorized as

