
Play-Persona: Modeling Player Behaviour in Computer Games


6) Help-on-Demand: The number of times help was requested, H. A key feature of TRU is the focus on puzzle solving. A typical puzzle could be a door which requires specific switches to be pressed in order to open. There are more than 200 registered puzzles in the game, which the players have to solve in order to progress through the game narrative (see Fig. 2). The game features a native Help-on-Demand (HoD) system which players can consult in order to get help with solving the puzzles. The player can request either a hint about how to solve the puzzle or a direct answer. The H value incorporates the total number of times that each player requested either a hint or an answer. The data from the 1365 players reveal that, for specific puzzles, players generally either request both hints and answers from the HoD system or no help at all. It was therefore decided to combine the hint and answer requests into the aggregated value H. The H value ranges from 0 to 148, with an average of 29.4 per player.
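The aggregation described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the log format, field names, and sample entries below are hypothetical stand-ins for whatever instrumentation TRU actually records.

```python
from collections import Counter

# Hypothetical per-puzzle HoD request log; field names are illustrative only.
# Each entry records which player asked for a hint or an answer on a puzzle.
hod_log = [
    {"player": 1, "puzzle": 12, "kind": "hint"},
    {"player": 1, "puzzle": 12, "kind": "answer"},
    {"player": 2, "puzzle": 12, "kind": "hint"},
    {"player": 1, "puzzle": 47, "kind": "hint"},
]

# H aggregates hint and answer requests indiscriminately into one count
# per player, matching the decision to combine the two request types.
H = Counter(entry["player"] for entry in hod_log)
# H[1] == 3 (two requests on puzzle 12, one on puzzle 47); H[2] == 1
```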

6. EMERGENT SELF-ORGANIZING MAPS

The self-organizing map (SOM) [6], or Kohonen map, iteratively adjusts a low-dimensional projection of the input space via vector quantization [17]. A SOM consists of neurons organized in a low-dimensional (2- or 3-dimensional) grid. Each neuron in the grid (map) is connected to the input vector through a d-dimensional connection weight vector m = {m1, ..., md}, where d is the size of the input vector, x. The connection weight vector is also called the prototype or codebook vector. In addition to the input vector, the neurons are connected to neighboring neurons of the map through neighborhood interconnections which generate the structure of the map: rectangular and hexagonal lattices organized in 2-dimensional sheet or 3-dimensional toroid shapes are among the most popular topologies used. SOM training can be viewed as a vector quantization algorithm which resembles k-means [17]; however, what differentiates the SOM is the update of the topological neighbors of the best-matching neuron: the whole neuron neighborhood is stretched towards the presented input vector. The outcome of SOM training is that neighboring neurons have similar weight vectors, which can be used for projecting the input data onto the two-dimensional space. Because of these properties, SOMs can be used for clustering data (unsupervised learning). For a more detailed description of SOMs, the reader is referred to [6].
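The training loop described above (find the best-matching neuron, then stretch its whole neighborhood towards the input) can be sketched with NumPy. This is a minimal illustrative sketch of the standard online SOM update with a Gaussian neighborhood on a rectangular 2-dimensional lattice; the grid size, decay schedule, and function name are assumptions, not details taken from the paper.

```python
import numpy as np

def train_som(data, grid_h=10, grid_w=10, n_iter=1000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Train a SOM with a rectangular lattice on `data` (n_samples x d)."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # Codebook (prototype) vectors: one d-dimensional weight vector per neuron.
    weights = rng.random((grid_h, grid_w, d))
    # Lattice coordinates of each neuron, used by the neighborhood function.
    gy, gx = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
    coords = np.stack([gy, gx], axis=-1).astype(float)

    for t in range(n_iter):
        x = data[rng.integers(n)]                  # present one input vector
        # Best-matching unit: neuron whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Linearly decaying learning rate and neighborhood radius (one
        # common choice; the paper does not specify a schedule).
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        # Gaussian neighborhood on the lattice: the whole neighborhood of
        # the BMU is stretched towards the presented input vector. This is
        # what distinguishes the SOM update from a plain k-means step.
        grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)
    return weights
```

Dropping the neighborhood term (updating only the BMU itself) reduces each step to an online k-means update, which is exactly the relationship to k-means noted above.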

The power of self-organization, which generates emergence of structure in the data, is lost when small SOMs are utilized. The topology preservation of the SOM projection is of little use and the advantage of neighbor-neuron relations is neglected, which makes a small SOM almost identical to a k-means clustering algorithm. Using large SOMs, called Emergent Self-Organizing Maps

