
→ INTERACT SPACE EXPERIMENT

Online Fact Sheet

telerobotics and haptics laboratory


→ BACKGROUND

Interactive robotics demonstration from on board the ISS

In early September this year, Danish astronaut Andreas Mogensen will perform a groundbreaking space experiment called Interact, developed by ESA in close collaboration with the TU Delft Robotics Institute. During the 2015 ESA Short Duration Mission, Mogensen will take control of the Interact Centaur rover on Earth from the International Space Station in real time, with force-feedback.

The activity is currently planned for Monday the 7th of September, but is subject to change depending on the ISS activity schedule.

The Mission

The Interact experiment, conceived and implemented by the ESA Telerobotics & Haptics Laboratory, will be the first demonstration of teleoperation of a rover from space to ground in which, during part of the experiment, the operator receives force-feedback. The astronaut's task is to maneuver the rover, located at ESA's ESTEC technical centre in Noordwijk, through a special obstacle course, to locate a mechanical task board and to perform a mechanical assembly task. Once the task board is located and approached, the astronaut will use a specially designed haptic joystick in space to take control of one of the Centaur's robotic arms on Earth. With the arm he will execute a "peg-in-hole" assembly task to demonstrate that connector mating with tight mechanical tolerances, well below one millimeter, can be performed through teleoperation. The haptic feedback allows the astronaut to actually feel whether the connector is correctly inserted and, if necessary, to fine-tune the insertion angle and alignment.

The complete operation is performed from on board the International Space Station, at approximately 400 km altitude, using a data connection via a geosynchronous satellite constellation at 36,000 km altitude. The communication between the haptic joystick and the ground system is bi-directional: both systems are essentially coupled. This so-called bilateral system is particularly sensitive to time delay, which can cause instability. The satellite connection, the Tracking and Data Relay Satellite System (TDRSS), introduces communication time delays as large as 0.8 seconds, which makes this experiment especially challenging. ESA copes with these challenges through specialized control algorithms developed at ESA's Telerobotics Laboratory, through augmented graphical user interfaces with predictive displays, and with 'force sensitive' robotic control algorithms on the ground. These ESA technologies allow the operator to work in real time from space on a planetary surface. It is as if the astronaut could extend his arm from space to ground.
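The sensitivity of such a bilateral coupling to delay can be illustrated with a toy simulation. The sketch below is a minimal, assumed model (unit masses, a virtual spring coupling, hand-picked gains), not ESA's control software: with no delay the coupled joystick/arm pair settles, while a 0.8-second delay makes the same coupling oscillate and grow.

```python
# Illustrative sketch of delay-induced instability in a bilateral
# coupling. Two unit masses (joystick and robot arm) are joined by a
# virtual spring, but each side only sees the OTHER side's position
# as it was `delay_s` seconds ago.

def simulate(delay_s, k=200.0, b=5.0, dt=0.001, t_end=5.0):
    """Return the peak |position| of the master over the run."""
    n_delay = int(delay_s / dt)
    xm, vm, xs, vs = 0.1, 0.0, 0.0, 0.0    # master starts displaced
    hist_m = [xm] * (n_delay + 1)          # delayed position buffers
    hist_s = [xs] * (n_delay + 1)
    peak = abs(xm)
    for _ in range(int(t_end / dt)):
        xs_old = hist_s[0]                 # slave position, as the master sees it
        xm_old = hist_m[0]                 # master position, as the slave sees it
        fm = -k * (xm - xs_old) - b * vm   # spring force on master (delayed)
        fs = -k * (xs - xm_old) - b * vs   # spring force on slave (delayed)
        vm += fm * dt; xm += vm * dt       # integrate unit masses
        vs += fs * dt; xs += vs * dt
        hist_m = hist_m[1:] + [xm]
        hist_s = hist_s[1:] + [xs]
        peak = max(peak, abs(xm))
    return peak

print("no delay    -> peak", simulate(0.0))   # stays near 0.1: settles
print("0.8 s delay -> peak", simulate(0.8))   # grows: unstable oscillation
```

The delayed spring keeps injecting energy into the loop, which is exactly why specialized control algorithms are needed before force-feedback over a satellite link becomes safe.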

ESA TELEROBOTICS LAB
Noordwijk, Netherlands
www.esa-telerobotics.net


→ THE ASTRONAUT

Astronaut Andreas Mogensen

Set to launch to the International Space Station on the 2nd of September, Danish ESA astronaut Andreas Mogensen is preparing for a short-duration mission of at most 10 days. Andreas has a background as an aerospace engineer and has familiarized himself with the technology at ESA's Telerobotics Laboratory.

Andreas can be followed by visiting andreasmogensen.esa.int


→ THE TEAM

ESA Telerobotics & Haptics Laboratory

The Interact experiment was conceived and developed by ESA's Directorate of Technical and Quality Management, in particular within ESA's Telerobotics & Haptics Laboratory, in collaboration with the TU Delft Robotics Institute.

The Interact experiment is supported by the ESA Human Spaceflight and Exploration Directorate, in particular by its ISS Programme and Exploration Department.

The ESA Telerobotics & Haptics Lab consists of a small but highly dynamic team of engineers and engineering academics. Led by Dr. André Schiele, Associate Professor at the Delft University of Technology, the team performs fundamental research in mechatronics, robotics and control theory. The Laboratory hosts several ESA staff members, research contractors and a varying number of Ph.D. and M.Sc. candidates supported via the Delft University of Technology.

The Interact Centaur design was created in close collaboration with a team of Industrial Design master's students from TU Delft in 2014.

Follow the ESA Telerobotics & Haptics Lab by visiting esa-telerobotics.net


→ TECHNICAL FEATURES

→ INTERACT CENTAUR

The mobile robotic platform, called the Interact Centaur, was specifically designed to maneuver through rough terrain at high speeds and to have the dexterity to perform very delicate and precise manipulation tasks under remote control. The custom vehicle design was brought from concept to reality in little over a year.

ROBOTIC PAN-AND-TILT NECK AND HEAD

A robotic neck with 6 degrees of freedom gives the cameras in the rover's head an enormous field of view, good for driving and for close visual inspection tasks.

REAL-TIME CAMERAS

The rover has four dedicated real-time streaming cameras that the astronaut can use during the mission: a head pan-and-tilt camera that provides a general contextual overview of the situation while driving and exploring the environment; a tool camera mounted on the right robotic arm for vision during precise tool manipulation; and two hazard cameras (front and back) that view the nearby area otherwise occluded by the chassis during driving.

COMPUTING

The robot makes use of seven high-performance computers running software that has been programmed in a highly modular, model-based and distributed way.

EXTERIOR DESIGN

A custom-made exterior protects all delicate mechatronic and computing hardware from dust and ensures a good thermal design.

ROBOTIC ARMS

Two KUKA lightweight robotic arms on the front of the rover allow the operator to perform very precise manipulation tasks. The arms can be 'soft controlled' to safely interact with humans or delicate structures and can be programmed to be compliant (like a spring and/or damper) when they hit an object. The arms are equipped with highly 'force sensitive' sensors and, during remote control, can flex and adapt in a similar manner to human arms. This allows the arms to be tightly coupled to an operator located far away by means of haptic (i.e. force-transmitting) interfaces. Their operation during the Interact experiment is very intuitive, allowing delicate and dexterous remote operations to take place across very long distances, with fine force-feedback to the operator despite the communication time delay.
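The 'spring and/or damper' behaviour described above is what control engineers call impedance control. A minimal 1-D sketch with assumed gains (not KUKA's or ESA's controller):

```python
# Sketch of a compliant ('soft') control law: a spring-damper around a
# commanded position, so the arm yields on contact instead of fighting.

def impedance_force(x_desired, x, v, k=300.0, d=40.0):
    """Spring-damper force pulling the arm toward x_desired.

    k: stiffness [N/m] -- a lower k gives a 'softer', more compliant arm
    d: damping   [N*s/m]
    """
    return k * (x_desired - x) + d * (0.0 - v)

# On contact, the commanded force stays bounded by k * penetration:
print(impedance_force(x_desired=0.05, x=0.00, v=0.0))  # 15 N toward target
print(impedance_force(x_desired=0.05, x=0.04, v=0.1))  # ~ -1 N: yielding
```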

ROVER MOBILE PLATFORM

The drivetrain and wheels of the Interact Centaur are a customized version of the remote-controlled platform manufactured by AMBOT. This battery-powered, four-wheel-drive, four-wheel-steering platform is weatherproof and gives the rover over 8 hours of run-time in challenging terrain.


→ AUGMENTED REALITY

Virtual model overlays in real-time

To provide extra support to the astronaut while driving the rover, an augmented reality (AR) overlay was developed. This allows virtual markers, such as predicted position markers, to be displayed on top of the camera feed.

The current rover position is shown with two yellow blocks in front of the wheels. Similarly, white blocks indicate the predicted rover position: before the rover moves, the operator can see where the rover is going to end up. Green blocks are used to align the rover with the task board.
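How such a predicted-position marker can be computed is sketched below: dead-reckoning the commanded motion over the link delay with a simple unicycle model. The model, the rates and the delay figure are illustrative assumptions, not the actual GUI code.

```python
# Sketch of a predictive overlay: integrate the commanded speed over
# the communication delay and draw the result ("white blocks") on top
# of the live camera feed.

import math

def predict_pose(x, y, heading, v, omega, delay_s, dt=0.05):
    """Integrate a simple unicycle model over the communication delay.

    v:     commanded forward speed [m/s]
    omega: commanded turn rate [rad/s]
    """
    for _ in range(int(delay_s / dt)):
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        heading += omega * dt
    return x, y, heading

# A rover driving at 0.5 m/s while turning gently: with ~0.8 s of delay
# the predicted marker lands roughly 0.4 m ahead of the reported pose.
print(predict_pose(0.0, 0.0, 0.0, v=0.5, omega=0.2, delay_s=0.8))
```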


→ LASER GUIDANCE

Embedded laser tool support

To visually support the astronaut when performing the mechanical alignment during the peg-in-hole assembly task, a laser has been embedded within the tool. When hovering over the hole, the laser spot becomes invisible, indicating that the connection can be attempted. The laser thus creates an artificial depth impression and acts as a dedicated depth cue. This allows such complex 3D tasks to be executed without a dedicated stereo 3D video system, which would consume excessive data bandwidth.
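The cue boils down to simple geometry: the beam runs along the insertion axis, so its spot vanishes exactly when the beam falls inside the hole. A toy model, assumed for illustration and not the flight logic:

```python
# Toy model of the laser depth cue: the spot is visible on the surface
# while misaligned, and disappears once the beam falls into the hole.

import math

def spot_visible(tool_x, tool_y, hole_x, hole_y, hole_radius):
    """True: spot on surface (misaligned). False: spot in hole (insert)."""
    offset = math.hypot(tool_x - hole_x, tool_y - hole_y)
    return offset > hole_radius

print(spot_visible(0.003, 0.000, 0.0, 0.0, hole_radius=0.002))  # True: adjust
print(spot_visible(0.001, 0.001, 0.0, 0.0, hole_radius=0.002))  # False: insert
```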



→ SPACE TO GROUND

Satellite communications

Tracking and Data Relay Satellite System (TDRSS)

As a complicating factor, the signals between the astronaut and the robot must travel via a dedicated and highly complex network of satellites in geosynchronous orbit. The signals travel from the International Space Station via NASA's TDRSS to ground facilities in the U.S. From there, they cross the Atlantic Ocean to the ESA facilities in Noordwijk, the Netherlands. Forces between the robot and its environment, as well as video and status data, travel back to the graphical user interface and the haptic joystick. In this round trip, all signals cover a distance of nearly 90,000 km. The resulting round-trip time delay approaches one second.
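A quick back-of-the-envelope check, using only figures quoted in this text, shows how much of that delay is pure physics:

```python
# Light travel time over the ~90,000 km round trip quoted above, plus
# the same calculation for an Earth-Moon link (mean distance 384,400 km).

C_KM_S = 299_792.458              # speed of light [km/s]

def light_time(distance_km):
    return distance_km / C_KM_S

print(f"TDRSS round trip      : {light_time(90_000):.2f} s")      # ~0.30 s
print(f"Earth-Moon round trip : {light_time(2 * 384_400):.2f} s") # ~2.56 s

# Propagation alone accounts for ~0.3 s; routing, coding and processing
# push the observed TDRSS delay toward the ~0.8 s quoted in the text.
```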

ESA developed a model-mediated control approach that makes it possible to provide force-feedback between distributed systems with up to multiple seconds of time delay, without a noticeable reduction in performance compared with directly coupled systems. While this smart software and these control methods enable the astronaut to perform such tasks on Earth, research suggests that humans can only handle signal transmission time delays of up to about three seconds for control tasks that require hand-eye coordination. In theory, this would allow haptic control from Earth of robotic systems as far away as the surface of the Moon.
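The core idea of model-mediated control can be sketched in a few lines: render the operator's forces from a local model of the remote environment, and let the delayed link carry only model updates. The class, the parameters and the single 'virtual wall' below are assumptions for illustration, not ESA's implementation:

```python
# Sketch of model-mediated teleoperation: the haptic loop runs against
# a LOCAL model, so force rendering stays stable regardless of link
# delay; the remote robot only sends occasional model updates.

class LocalEnvironmentModel:
    """Local stand-in for the remote world: a wall at `wall_pos` with
    stiffness `k`, refreshed whenever delayed measurements arrive."""

    def __init__(self, wall_pos=0.10, k=500.0):
        self.wall_pos = wall_pos
        self.k = k

    def update(self, measured_wall_pos, measured_k):
        # Called when a (possibly seconds-old) estimate arrives from the
        # robot; the model, not the raw delayed signal, drives the haptics.
        self.wall_pos = measured_wall_pos
        self.k = measured_k

    def feedback_force(self, joystick_pos):
        # Rendered locally at high rate: no round trip inside this loop.
        penetration = joystick_pos - self.wall_pos
        return -self.k * penetration if penetration > 0.0 else 0.0

model = LocalEnvironmentModel()
print(model.feedback_force(0.12))  # pressing 2 cm into the wall: -10 N
print(model.feedback_force(0.05))  # free space: 0 N
```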

[Map: signal path from the International Space Station (ISS) via the NASA Ground Terminals in New Mexico, USA, to ESTEC in Noordwijk, Netherlands; a round trip of nearly 90,000 km]


→ HAPTICS-1 JOYSTICK

Teleoperation of earthbound robotics with real-time force-feedback from space

On board the ISS, the astronaut will re-use equipment from the Telerobotics & Haptics Lab's previous experiments, Haptics-1 and Haptics-2. For these experiments, a tablet PC and a small force-reflective joystick were flown to the ISS with the goal of evaluating human haptic perception in space and validating real-time telerobotic operations from space to ground. During Haptics-1, on the 30th of December 2014, haptics was used for the first time in the microgravity environment of the ISS. During Haptics-2, on June 3rd (21:00 CEST) 2015, for the first time in history, a handshake with force-feedback was performed between two humans, one located in space and one on the ground.


telerobotics and haptics laboratory

WITH INTERACT, ESA AIMS TO PRESENT AND VALIDATE A FUTURE WHERE HUMANS AND ROBOTS EXPLORE SPACE TOGETHER. ROBOTS WILL PROVIDE THEIR OPERATORS WITH MUCH WIDER SENSORY FEEDBACK OVER MUCH GREATER DISTANCES THAN TERRESTRIAL ROBOTS CAN TODAY. NOT ONLY IN SPACE, BUT ALSO ON EARTH, REMOTE-CONTROLLED ROBOTICS WILL PROVE HIGHLY ENABLING IN DANGEROUS AND INACCESSIBLE ENVIRONMENTS. THEY CAN BE USED IN ARCTIC CONDITIONS, IN THE DEEP SEA OR FOR ROBUST INTERVENTION IN NUCLEAR DISASTER SITES. WE CAN EXPECT THAT FUTURE HUMAN EXPLORATION MISSIONS TO THE MOON AND MARS WILL BENEFIT FROM SUCH ADVANCED HUMAN-ROBOTIC OPERATIONS. ESA'S RESEARCH IN TELEROBOTIC TECHNOLOGIES AND ADVANCED CREW OPERATIONS FROM ORBIT WILL PLAY A KEY ROLE IN THESE COMING ADVENTURES. THE ESA TELEROBOTICS AND HAPTICS LABORATORY, ALONG WITH ESA'S TECHNICAL AND SPACE EXPLORATION DIRECTORATES, IS DEDICATED TO TAKING THE NEXT BIG STEPS IN HUMAN-ROBOT COLLABORATION IN SPACE.

ESA TELEROBOTICS & HAPTICS LABORATORY
TU DELFT ROBOTICS INSTITUTE
interact ✦ MOGENSEN ✦
