July 2006 Volume 9 Number 3 - CiteSeerX

To analyze the convergence behavior of the particles, we test whether the swarm evolves toward the same optimization goal. We propose an information-entropy measure of the similarity convergence among the particles as follows. Let p_ij be the binary value of the jth bit of the ith particle, i = 1, 2, …, R, and j = 1, 2, …, NK, where R is the swarm size. We can calculate prob_j as the conditional probability that value one occurs at the jth bit, given the total number of bits that take value one in the entire swarm, as follows.

prob_j = (∑_{i=1}^{R} p_{ij}) / (∑_{i=1}^{R} ∑_{h=1}^{NK} p_{ih}).

The particle entropy can then be defined as

Entropy = − ∑_{j=1}^{NK} prob_j log₂(prob_j).

The particle entropy is smaller when the probability distribution is denser. The variation of particle entropy during the swarm evolution therefore measures the convergence of similarity among all particles. If the particles are highly similar to one another, the values of the non-zero prob_j are high, resulting in a denser probability distribution and a lower entropy value. This also means the swarm particles have reached a consensus about which test items should be selected to compose the test sheets.
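The entropy measure defined above is straightforward to compute. The following is a minimal sketch, assuming each particle is represented as a list of 0/1 bits; the function name `particle_entropy` and the toy swarms are illustrative, not part of the original system.

```python
import math

def particle_entropy(swarm):
    """Compute the particle entropy of a binary swarm.

    swarm: list of R particles, each a list of NK binary bits (0 or 1).
    prob_j is the fraction of all one-bits in the swarm that occur
    at bit position j; the entropy is -sum(prob_j * log2(prob_j)).
    """
    total_ones = sum(sum(p) for p in swarm)
    nk = len(swarm[0])
    entropy = 0.0
    for j in range(nk):
        ones_at_j = sum(p[j] for p in swarm)
        prob_j = ones_at_j / total_ones
        if prob_j > 0:  # 0 * log2(0) is taken as 0
            entropy -= prob_j * math.log2(prob_j)
    return entropy

# A fully converged swarm (identical particles) concentrates the
# probability mass on fewer bit positions, so its entropy is lower
# than that of a diverse swarm.
converged = [[1, 0, 1, 0]] * 4
diverse = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(particle_entropy(converged))  # 1.0
print(particle_entropy(diverse))    # 2.0
```

As the toy swarms show, the converged swarm spreads its one-bits over only two positions (entropy 1.0), while the diverse swarm spreads them over four (entropy 2.0), matching the behavior described above.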

Figure 6 shows the variation of particle entropy as the number of generations increases. The entropy value drops drastically during the first 18 generations, as the particles exchange information by referring to the swarm's best solution. After this period, the entropy value remains relatively stable owing to the good-quality solutions found and the high similarity among the particles, meaning the particles are settling on the same high-quality solution as the swarm converges.

Figure 6. The particle entropy as the number of generations increases

6. Conclusions and Future Work

In this paper, a particle swarm optimization-based approach is proposed to cope with serial test sheet composition problems. The algorithm has been embedded in an intelligent tutoring, evaluation, and diagnosis system with large-scale test banks that are accessible to students and instructors through the World Wide Web. To evaluate the performance of the proposed algorithm, a series of experiments was conducted comparing the execution time and solution quality of three solution-seeking strategies on twelve item banks. Experimental results show that serial test sheets whose average difficulty is near-optimal with respect to a specified target value can be obtained within reasonable time by employing the novel approach.

For further application, collaborative plans with some local e-learning companies are under way, in which the present approach is used in the testing and assessment of students in elementary and junior high schools.
