Submitted version of the thesis - Airlab, the Artificial Intelligence ...
Chapter 6

Game
The robot will be used as a testbed to develop robogames. As a testbed, a simple game is implemented, using almost all the components available on the robot. The game is also used to verify the correctness of the implemented solutions. Basically, the robot has to reach several targets in sequence, while avoiding obstacles, until the final target is acquired.
The game flow can be expressed more clearly in algorithmic form. The class diagram is shown in Appendix A.1 and the flow diagram in Appendix A.2. The microcontroller starts by checking the game end status, which is composed of two conditions. First, it checks whether the current target has been acquired. If it has, it also checks whether this is the last target; if so, the game ends and the robot stays at its current position. If it is not the last target, the algorithm leaves the game status check phase and continues the search. Before performing any other operation, the collision status is checked. If a collision is detected, a proper command is sent to the motors to move the robot away from the obstacle.
The next step is blob search and blob tracking. The camera searches for the target within its field of view. If no blob is found, the robot turns around its center of rotation until a blob is found or a collision is detected. Normally, collisions would not need to be checked during a turn around the center of rotation but, as discussed previously, we cannot guarantee the correctness of the movement since the motors lack encoders. When a blob is found, the first step is to check its color information, since targets are distinguished by color. If the cor-