SSS: A Hybrid Architecture Applied to Robot Navigation - CiteSeerX

Behavior-based systems discretize the possible states of the world into a small number of special task-dependent categories. Symbolic systems take this one step further and also discretize time on the basis of significant events. They commonly use terms such as "after X do Y" and "perform A until B happens". Since we create temporal events on the basis of changes in spatial situations, it does not make sense for us to discretize time before space. For the same reason, we do not include a fourth layer in which space is continuous but time is discrete.

In order to use these three fairly different technologies we must design effective interfaces between them. The first interface is the command transformation between the behavior-based layer and the underlying servos. Subsumption-style controllers typically act by adjusting the setpoints of the servo-loops, such as the wheel speed controller, to one of a few values. All relevant PID calculations and trapezoidal profile generation are then performed transparently by the underlying servo system.

The sensory interface from a signal-processing front-end to the subsumption controller is a little more involved. A productive way to view this interpretation process is in the context of "matched filters" [16, 6]. The idea here is that, for a particular task, certain classes of sensory states are equivalent since they call for the same motor response by the robot. There are typically some key features that, for the limited range of experiences the robot is likely to encounter, adequately discriminate the relevant situations from all others.
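As a rough sketch of this idea (the sensor format, feature names, and thresholds below are invented for illustration, not taken from the paper), a few key-feature tests can each map a whole class of sensory states to a single motor response, expressed as a new servo setpoint:

```python
# Hypothetical "matched filter" recognizers: each test keys on the few
# features that, for the robot's limited range of experiences, suffice
# to discriminate one relevant situation from all others.

def left_opening(sonar):
    """Left sonar reads far while the front stays clear -- the
    signature of a doorway or intersection on the left.
    `sonar` is a dict of range readings in metres (invented format)."""
    return sonar["left"] > 2.0 and sonar["front"] > 1.0

def blocked_ahead(sonar):
    """Something sits squarely in the robot's path."""
    return sonar["front"] < 0.4

def select_response(sonar):
    """Map the current sensory class to one motor response, here a
    setpoint command for the underlying servo-loops."""
    if blocked_ahead(sonar):
        return ("set_speed", 0.0)       # stop for the obstacle
    if left_opening(sonar):
        return ("set_turn_rate", 0.5)   # turn toward the opening
    return ("set_speed", 0.3)           # default: cruise forward
```

Note that the responses are only setpoint adjustments; the PID calculations and profile generation stay inside the servo layer, as described above.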
Such "matched filter" recognizers are the mechanism by which spatial parsing occurs.

The command interface between the symbolic and subsumption layers consists of the ability to turn each behavior on or off selectively [7], and to parameterize certain modules. These event-like commands are "latched" and continue to remain in effect without requiring constant renewal by the symbolic system.

The sensor interface between the behavior-based layer and the symbolic layer is accomplished by a mechanism which looks for the first instant in which various situation recognizers are all valid. For instance, when the robot has not yet reached its goal but notices that it has not been able to make progress recently, this generates a "path blocked" event for the symbolic layer. To help decouple the symbolic system from real-time demands we have added a structure called the "contingency table". This table allows the symbolic system to pre-compile what actions to take when certain events occur, much as baseball outfielders yell to each other "the play's to second" before a pitch. The entries in this table reflect what the symbolic system expects to occur and each embodies a one-step plan for coping with the actual outcome.

Figure 2 - TJ has a 3-wheeled omni-directional base and uses both sonar ranging and infrared proximity detection for navigation. All servo-loops, subsumption modules, and the contingency table are on-board. The robot receives path segment commands over a spread-spectrum radio link.

2: The navigation task

The task for our robot (shown in figure 2) is to map a collection of corridors and doorways and then rapidly navigate from one office in the building to another. To accomplish this we had to address two basic navigation problems.
The first of these is compensating for the variability of the environment. There are often people in the halls, doors open and close by random amounts, some days there are large stacks of boxes, and the trash cans are always moving around.

We solve the variability problem by restricting ourselves to very coarse geometric maps. We record only the distances and orientations between relevant intersections. The actual details of the path between two points are never recorded, thus there are never any details to correct. However, to make use of such a coarse map the robot must be able to reliably follow the segments so described [4]. Fortunately, behavior-based systems are particularly good at this sort of local navigation.

The other basic navigation problem is knowing when the robot has arrived back in a place it has been before. The difficulty is that, using standard robot sensory systems, an individual office or particular intersection is not distinguishable from any other. One approach is to use odometry to determine the absolute position of the robot.
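The coarse map described above can be sketched as a small graph whose edges carry only a distance and a heading (the class, method names, and node labels here are hypothetical, not from the paper):

```python
# Hypothetical coarse geometric map: only the distance and orientation
# between relevant intersections are stored.  The details of the path
# between two points are never recorded, so there are never any
# details to correct when the environment changes.

class CoarseMap:
    def __init__(self):
        # node -> list of (neighbor, distance_m, heading_deg) segments
        self.edges = {}

    def add_segment(self, a, b, distance, heading):
        """Record one traversable segment between two intersections."""
        self.edges.setdefault(a, []).append((b, distance, heading))
        # The reverse traversal uses the opposite heading.
        self.edges.setdefault(b, []).append(
            (a, distance, (heading + 180) % 360))

    def segments_from(self, node):
        """Segments the behavior-based layer can be told to follow."""
        return self.edges.get(node, [])

m = CoarseMap()
m.add_segment("hall_T", "office_A", 6.5, 90)   # invented example nodes
```

Executing a route then reduces to handing the behavior-based layer one segment command at a time, relying on its local navigation competence to absorb the clutter in between.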
