

Because team A is always taken to be the winner and the question asked is always the probability that team A will win, the input for $w_A$ will always be $1$ and the input for $w_B$ will always be $-1$. The equation for the model can be written as:

\[
\text{Output} = \Pr(A \text{ def. } B) = \frac{1}{1 + e^{-(w_A - w_B)}}, \tag{4}
\]

which is the same as (3), except that $w_A$ and $w_B$ are substituted for $\theta_A$ and $\theta_B$.
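As a quick illustration of (4), here is a minimal Python sketch; the function name and the sample ratings are assumptions made for the example, not values from the paper.

```python
import math

def win_probability(w_a: float, w_b: float) -> float:
    """Pr(A def. B) from equation (4): a logistic function of the
    rating difference (inputs are fixed at +1 for A and -1 for B)."""
    return 1.0 / (1.0 + math.exp(-(w_a - w_b)))

# Illustrative ratings (not from the paper): A is one rating point above B.
print(win_probability(1.5, 0.5))  # ~0.731
```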

The model is updated by applying the delta rule, which is written as follows:

\[
\Delta w_i = \eta \delta x_i, \tag{5}
\]

where $\eta$ is a learning rate, $\delta$ is the error measured with respect to the output, and $x_i$ is the input for $w_i$. The error $\delta$ is usually measured as:

\[
\delta = \text{Target Output} - \text{Actual Output}. \tag{6}
\]

For the ANN Bradley-Terry model, the target output is always 1, given the assumption that A will defeat B. The actual output is the output of the single-node ANN. Therefore, the delta rule error can be rewritten as:

\[
\delta = 1 - \Pr(A \text{ def. } B) = 1 - \text{Output}, \tag{7}
\]

and the weight updates can be written as:

\[
\Delta w_A = \eta(1 - \text{Output}), \tag{8}
\]
\[
\Delta w_B = -\eta(1 - \text{Output}). \tag{9}
\]

Here, $x_i$ is implicitly $1$ for $w_A$ and $-1$ for $w_B$, again because A is assumed to be the winning team. It is well known that a single-layer ANN trained with the delta rule will converge to the global minimum as long as the learning rate is sufficiently small.
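The whole update step can be sketched in the same way. The Python sketch below folds equations (4) and (7)-(9) into a single function; the function name, the learning rate, and the example ratings are again illustrative assumptions, not the paper's values.

```python
import math

def delta_rule_update(w_a: float, w_b: float, eta: float = 0.1) -> tuple[float, float]:
    """One delta-rule step for a match in which A defeats B."""
    output = 1.0 / (1.0 + math.exp(-(w_a - w_b)))  # actual output, eq. (4)
    delta = 1.0 - output                           # error, eq. (7); target is 1
    # x_i is implicitly +1 for w_A and -1 for w_B, giving eqs. (8) and (9).
    return w_a + eta * delta, w_b - eta * delta

# Illustrative usage: the underdog A wins, so both ratings shift
# toward A by the same amount, eta * delta.
w_a, w_b = delta_rule_update(0.0, 1.0, eta=0.1)
print(round(w_a, 3), round(w_b, 3))  # 0.073 0.927
```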
