1. Introduction

1.1 Background

It is not part of the essence of a computer to understand what it is doing: every action a computer performs must be explicitly anticipated and planned for. So we are happy to accept computers as obedient tools. However, for an increasingly large number of applications we require systems that can, to some extent, decide for themselves.

A field where such applications are becoming increasingly interesting is finance. There is a fast-growing literature attempting to model financial interactions using computer agents, going beyond the restrictions of analytical methods [LeBaron, 1998]. At the same time, analytical approaches are closely related to this development. A continuous interaction between computational and analytical approaches is essential to progress within the field.

In the future, financial markets might very well be important areas of application for agent-based modelling. They offer features that make them very appealing to this type of modelling. One such feature, for example, is that financial data is readily available at many different frequencies, from annual to minute by minute. Naturally, there are many hurdles too. Many empirical puzzles have been difficult for standard representative-agent models to explain, and since this is a young research area, many questions remain unanswered [ibid.].

From a more practical point of view, one enticing question is whether agents could beat humans as traders.
There has been some research conducted on this issue, and results show that humans usually lose against computer agents [Chang, 2001]. One great advantage computers have, vis-à-vis humans, is their speed: computer agents can respond to slight changes in prices in a fraction of a second. In a test conducted by IBM, software-based robotic agents made seven per cent more cash than people did. In these tests both agents and people had the same set-up, allowing them to trade through an unbiased software-based auctioneer [Graham-Rowe, 2001]. The auction simulated a market where buyers and sellers had a fixed amount of time to trade in a single commodity. In other tests conducted by IBM, using double auctions (the same type of auction stock markets use), agents were on average about 5 per cent more profitable than people [Chang, 2001]. According to Dr. Steve R. White, head of this research at IBM, the agents excelled even though they were programmed with rather simple strategies. This was possible because they could quickly pounce on someone else's mistake, and because they never made mindless mistakes — selling something at a loss, for example — which humans tend to do at times.
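To make the double-auction mechanism mentioned above concrete, the following is a minimal sketch of a continuous double auction in Python. It is not IBM's actual set-up; the class and method names are my own, and real exchanges add time-priority rules, tick sizes, and richer order types:

```python
import heapq

class DoubleAuction:
    """Minimal continuous double auction: match best bid against best ask.

    Illustrative sketch only; real matching engines add time priority,
    tick sizes, order cancellation, and many order types.
    """

    def __init__(self):
        self._bids = []   # max-heap via negated price: (-price, order_id, qty)
        self._asks = []   # min-heap: (price, order_id, qty)
        self._next_id = 0
        self.trades = []  # list of (price, qty) for executed trades

    def submit(self, side, price, qty):
        """Submit a limit order and match it against the opposite book."""
        self._next_id += 1
        if side == "buy":
            heapq.heappush(self._bids, (-price, self._next_id, qty))
        else:
            heapq.heappush(self._asks, (price, self._next_id, qty))
        self._match()

    def _match(self):
        # Trade while the best bid meets or exceeds the best ask.
        while self._bids and self._asks and -self._bids[0][0] >= self._asks[0][0]:
            neg_bid, bid_id, bid_qty = heapq.heappop(self._bids)
            ask_price, ask_id, ask_qty = heapq.heappop(self._asks)
            qty = min(bid_qty, ask_qty)
            # Execute at the ask price (one simple convention).
            self.trades.append((ask_price, qty))
            # Return any unfilled remainder to the book.
            if bid_qty > qty:
                heapq.heappush(self._bids, (neg_bid, bid_id, bid_qty - qty))
            if ask_qty > qty:
                heapq.heappush(self._asks, (ask_price, ask_id, ask_qty - qty))
```

For example, with a resting bid for 10 units at 101 and an incoming ask for 4 units at 99, the books cross and a trade of 4 units executes, leaving 6 units of the bid resting on the book.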
Such scenarios may be less likely, though, in 'thin' markets, where there are only a few buyers and sellers and where good deals are a matter of skill rather than speed.

Along the above train of thought, there is also research showing that changes to share prices do not accord well with the degree of reliability of the information reaching the stock market [Bloomfield et al., 2000], a condition that agents could improve upon. Another advantage computers might have, vis-à-vis humans, is that they could be designed not to fall prey to 'herd' mentality and other psychological contingencies that, to a greater or lesser degree, rule humans. A closer look at market statistics has shown that the distribution of the price return — the difference between the purchase and sale price of a share — is not 'Gaussian' but 'power law', the mathematical sign that all is not random. This suggests that brokers follow herd instincts [Haw, 2001]. This makes sense, as brokers probably act on rumours that spread across the markets. Similarly, big transactions on a stock market do create a following. Indeed, simulations with an artificial stock market, with computer agents programmed to have 'herd instincts', have shown statistical similarities with their human counterpart [ibid.].

There is also research showing that the dynamics of share prices can be described as a random walk [Osborne, 1977]. This contradicts what I have stated above. Yet other reports show that there are departures from a simple random walk in stock prices. These anomalies are statistically significant, but in practice their effects are not large. It follows that the effort I am making tests, in a way, how good an approximation the random-walk hypothesis of stock prices is, because it is in the departures from the random walk that profits can be found.

With the above in mind, one could envision scenarios where computer agents could stabilize the stock market. Although the crash of the stock markets around the world in 1987 may have been induced or facilitated by computer-aided investment strategies [Lux, 1995], computer agents could also, for example, have a stabilizing effect if they did not have the same herd instincts as human brokers.
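The contrast between Gaussian and power-law ('fat-tailed') returns can be illustrated numerically. The sketch below is my own, not taken from any of the cited studies; it uses a symmetric Pareto variate as a stand-in for power-law returns and compares sample excess kurtosis, which is near zero for a Gaussian but markedly positive when the tails are heavy:

```python
import random
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: about 0 for Gaussian data, large for fat tails."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return statistics.fmean(((x - m) / s) ** 4 for x in xs) - 3.0

random.seed(0)
n = 100_000

# Returns from a Gaussian random walk.
gauss_returns = [random.gauss(0.0, 1.0) for _ in range(n)]

# Symmetric Pareto-like returns as a stand-in for power-law tails.
heavy_returns = [random.choice((-1, 1)) * (random.paretovariate(3.0) - 1.0)
                 for _ in range(n)]

gk = excess_kurtosis(gauss_returns)   # typically close to 0
hk = excess_kurtosis(heavy_returns)   # markedly positive: fat tails
```

Large rare moves dominate the fourth moment of the heavy-tailed sample, which is exactly the kind of statistical fingerprint that distinguishes real return series from a pure Gaussian random walk.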
Humans and agents would interact in a complementary way that would make the stock market less volatile. In this way the danger of having to resort to more dramatic measures such as circuit breakers, trading collars, etc. (measures to counteract extreme instability, for example a dramatic fall of an exchange) would be reduced. Considering the increasing volatility of stock markets around the world in recent years, this would be a most welcome change for institutions running exchanges.

1.2 Problem

Assignment

For an institution running a marketplace, high credibility of the marketplace is of utmost importance. To achieve this end, many different measures come into consideration. Among these, measures that decrease transaction costs are desirable [Wennerberg, 2001]. To decrease transaction costs, one may, for example, optimize the market structure, trading functionality, and fees. I have focused on issues concerning trading functionality.

Specifically, the assignment given to me was to investigate future possibilities for trading agents to be a part and functionality of the servers of a stock market. If they like, clients should be able to connect to a moderately customizable trading agent that takes care of trading on their behalf according to their directions.
These trading agents would run on a separate server with a direct link to the server that runs the matching service. In this way, the continuous evaluation of conditions done by the agents would not disturb the normal functionality of the matching service. The agents would have very fast and easy access to the matching service, as they would not be hindered by slow connections on the Internet or heavy network loads. Clients could place their orders with these agents rather than placing their orders, one at a time, via a network. This would be particularly advantageous when approaching heavy-load situations. With the right conditions fulfilled, extreme levels of messaging and network traffic could be reduced to levels at which measures such as throttling (braking the matching service to avoid congestion in price-update dissemination) could be avoided. This, in turn, might increase the number of deals completed.
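A minimal sketch of such a server-side agent might look as follows. All names here (ServerSideAgent, Instruction, the trigger callback) are hypothetical, invented purely for illustration; the point is only that standing client instructions are evaluated next to the matching service, so one message reaches the matching engine per triggered order rather than one network message per client-side price check:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Instruction:
    """A standing client instruction held by the server-side agent."""
    symbol: str
    side: str                          # "buy" or "sell"
    qty: int
    trigger: Callable[[float], bool]   # condition evaluated on each price update

class ServerSideAgent:
    """Holds client instructions next to the matching engine and emits an
    order only when a trigger fires, instead of relaying every client check."""

    def __init__(self, send_order: Callable[[str, str, int], None]):
        self._send_order = send_order  # direct link to the matching service
        self._instructions: List[Instruction] = []

    def place(self, instruction: Instruction) -> None:
        """Register a standing instruction on behalf of a client."""
        self._instructions.append(instruction)

    def on_price_update(self, symbol: str, price: float) -> None:
        """Evaluate all standing instructions against a new price."""
        remaining = []
        for ins in self._instructions:
            if ins.symbol == symbol and ins.trigger(price):
                self._send_order(ins.symbol, ins.side, ins.qty)  # fires once
            else:
                remaining.append(ins)
        self._instructions = remaining
```

For example, a client could place a "buy 100 when the price falls to 50 or below" instruction once; the agent then watches the price stream locally and sends a single order when the condition is met, which is the traffic reduction argued for above.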