Download PDF - Department of Navy Chief Information Officer - U.S. ...

Using a biological model on the robot, the scanning laser is the analogy to subconscious perception, precisely surveying the robot's surroundings to build a map and keep the system geographically referenced. The robot's video camera can then be cued to "consciously" investigate any anomalies detected by the laser, such as an opening in a nearby wall. Is it a doorway, a window, bomb damage?

Greg Kogut, SSC Pacific unmanned systems branch's vision expert, summed up the vision concept: "Instead of constantly trying to classify everything in the scene across an infinite spectrum of possibilities, vision is now given discrete tasks bounded in terms of both scope and location. The results have been quite impressive."

If the vision system identifies a laser-detected wall opening as a doorway, for example, it next looks to either side of that feature to see if there is an associated room sign. If such a sign is found, the camera lens zooms in to read it and links any relevant descriptive information, such as room number, purpose or occupant, along with the "X, Y" coordinates of the door opening.
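The laser-cued pipeline described above, in which the laser flags an anomaly, vision classifies it within a bounded task, and a sign search is confined to the region beside the opening, can be sketched roughly as follows. All function names and data fields here are hypothetical illustrations, with stubs standing in for the real camera and optical character recognition processing; this is not the actual ARMS software.

```python
# Hypothetical sketch of the laser-cued vision pipeline described above.
# Stub classifiers stand in for the real camera tasks; names are illustrative.

def classify_opening(opening):
    """Stub vision task: decide what a laser-detected opening is.
    Bounded task: choose only among a few known feature types."""
    return opening.get("type", "unknown")  # e.g. "doorway", "window", "damage"

def read_room_sign(opening):
    """Stub vision task: search just left/right of the opening for a sign,
    zoom in and read it. Returns the sign text or None."""
    return opening.get("sign")  # pretend OCR result

def process_laser_anomaly(opening, semantic_map):
    """Cue the camera on one laser-detected anomaly and record what it finds."""
    if classify_opening(opening) != "doorway":
        return  # other feature types would get their own scripted tasks
    label = read_room_sign(opening)
    if label is not None:
        # Link the descriptive info to the door's (x, y) map coordinates.
        semantic_map[label] = (opening["x"], opening["y"])

# Example: two anomalies reported by one laser scan.
semantic_map = {}
anomalies = [
    {"type": "doorway", "x": 4.2, "y": 7.9, "sign": "Room 102"},
    {"type": "window",  "x": 9.0, "y": 3.1},
]
for a in anomalies:
    process_laser_anomaly(a, semantic_map)

print(semantic_map)  # {'Room 102': (4.2, 7.9)}
```

A map built this way is also what would let a later voice command such as "Go to room 102" resolve a label to coordinates with no human assistance.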
Each of these scripted tasks is both focused in objective and physically constrained to the appropriate portion of the overall field of view, thus significantly reducing complexity. The robot also learns important information about its environment with no human assistance that allows it to later execute high-level voice commands such as "Go to room 102" or "Enter the generator room."

"… we were struck by a rather sobering observation, in that for decades we had been trying to emulate human perception and intelligence on a robot, yet we really had precious little insight into either … I spent about a year reading 30-plus books on both subjects, [and] then restructured our algorithmic approach accordingly."
– Bart Everett

In 2004, the ARMS team presented a landmark paper at the Society of Photo-Optical Instrumentation Engineers Mobile Robots XVII Conference in Philadelphia. (SPIE is an international membership society, serving scientists and engineers in industry, academia and government. SPIE members work in a wide variety of fields that utilize some aspect of optics and photonics, which is the science and application of light.)

The concept, presented in the paper, "Towards a Warfighter's Associate: Eliminating the Operator Control Unit," envisions the proximal interaction of a human-robot team, similar to the pairing of police officers and their canine partners in law enforcement.

One of the biggest challenges is finding a robust means of command and control for scenarios where the warfighter and the robot work side by side.

"In looking back over the past few years, we have progressed from joystick control to mouse control — and now even voice control," explained SSC Pacific's Gaurav Ahuja.

Mr. Ahuja is working with a CCAT awardee, Think-a-Move, Ltd., to integrate Think-a-Move's patented earpiece into the robot's design. The earpiece captures the sound waves created in the ear canal when people speak.

"But if the human-robot team walks into an ambush," Ahuja continued, "even voice control is not good enough. There is no time to talk to a robot in this situation [because] the warfighter must instinctively react to ensure his or her survival."

For the robot to provide any value in this scenario, it must similarly infer what it should do based on what it sees the human doing and the perceived threat level in the surrounding environment.

According to SSC Pacific project engineer Donnie Fellars, "A good analogy here would be a hunter and a bird dog. The dog knows what to do during the hunt by watching the hunter, and changes modes without direction to find the game, point out the game, and then retrieve the game. We want the robot to follow that same model."

"What we're trying to do is give the robot some degree of artificial empathy, which is a tall order," Everett added. "It's hard enough to emulate human behavior, and animals are far more adept at reading body language and judging intent than humans."

To get around this problem, the team tapped information similar to the data collected by the Warfighter Physiological Status Monitor (WPSM), which is being developed under the Army's Future Force Warrior program. WPSM is a wearable physiological sensor suit that monitors body temperature, heart rate, blood pressure, hydration, stress levels and body position. An Airsoft M4 serves as a laboratory surrogate for the Soldier's weapon, instrumented to indicate orientation as well as safety and trigger status.
(Airsoft guns resemble real guns but shoot small plastic BBs.)

Passing all this valuable information to the robot enables some fairly sophisticated human-robot interaction with no additional control requirements imposed upon the warfighter.

In one potential scenario, for example, the robot automatically follows or precedes a dismounted Soldier using a combination of ultrasonic, laser and video sensors. If the human stops, the robot also halts and begins using its video camera to search for potential threats. Should the warfighter unlock the safety on his or her weapon, the robot redirects its gaze in the direction the weapon is pointing. If a firefight ensues, the robot can reposition itself between the warfighter and the nearest detected threat.

All these supporting behaviors are seamlessly invoked in response to changes associated with the threat environment, allowing the human partner to put far more focus on survival. This revolutionary new approach to human-robot interaction should facilitate a multitude of mission capabilities now considered impractical due to the control burden imposed by current tele-operated systems.

While much work remains to be done before the "Warfighter's Associate" concept is formally vetted for operational use, initial reaction from the user community has been very favorable.

Ann Dakis works in the SSC Pacific public affairs office.

CHIPS October – December 2008
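The supporting behaviors in that scenario amount to a small arbitration rule keyed off the observed warfighter state: moving or stopped, weapon safety on or off, firefight underway. A minimal sketch of that arbitration follows; the state fields and behavior names are hypothetical stand-ins mirroring the WPSM-style cues the article describes, not the fielded software.

```python
# Hypothetical sketch of the behavior arbitration described above.
# Inputs mirror the article's cues (movement, weapon safety, firefight
# status); all names are illustrative only.

def select_behavior(warfighter):
    """Pick the robot's supporting behavior from the observed human state,
    ordered from highest threat level to lowest."""
    if warfighter["firefight"]:
        return "interpose"        # move between human and nearest threat
    if not warfighter["safety_on"]:
        return "track_weapon"     # redirect gaze where the weapon points
    if warfighter["moving"]:
        return "follow"           # follow or precede the dismounted Soldier
    return "scan_for_threats"     # human stopped: halt, search with camera

# Example: the human stops, then unlocks the weapon safety.
state = {"moving": True, "safety_on": True, "firefight": False}
print(select_behavior(state))     # follow

state["moving"] = False
print(select_behavior(state))     # scan_for_threats

state["safety_on"] = False
print(select_behavior(state))     # track_weapon
```

The point of the ordering is that no behavior change requires a command from the warfighter; each mode switch is triggered by state the robot already observes.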
