11.07.2015 Views

Case study: LiDAR system provides helicopter pilots a clear line of ...

Case study: LiDAR system provides helicopter pilots a clear line of ...

Case study: LiDAR system provides helicopter pilots a clear line of ...

SHOW MORE
SHOW LESS

You also want an ePaper? Increase the reach of your titles

YUMPU automatically turns print PDFs into web optimized ePapers that Google loves.

Technology: 3D situational awareness<strong>Case</strong> <strong>study</strong>: <strong>LiDAR</strong> <strong>system</strong> <strong>provides</strong> <strong>helicopter</strong><strong>pilots</strong> a <strong>clear</strong> <strong>line</strong> <strong>of</strong> sight in brownoutsBy Maureen CampbellSingle Print OnlyPhoto courtesy <strong>of</strong> U.S. ArmyToday’s combat zones are inundated with threats. Enemyfire, Improvised Explosive Devices (IEDs), and mines area few <strong>of</strong> the imminent dangers that war fighters face. Onethreat that has become a major focus in recent yearsis vision obstruction in a <strong>helicopter</strong>’s landing zone, causedby brownouts and whiteouts. However, a new modified<strong>LiDAR</strong> <strong>system</strong> is providing 3D images to help increase<strong>pilots</strong>’ situational awareness.As the war on terrorism continues, <strong>helicopter</strong>s play a crucialrole in both combat and civilian missions, in everything frommedical evacuations to crew transportation. Due to the threat<strong>of</strong> IEDs, <strong>helicopter</strong>s are now becoming the preferred method <strong>of</strong>travel to and from mission coordinates. But in arid environmentslike Afghanistan, landing visibility is constantly obstructed bybrownout conditions.“Brownouts” occur when a <strong>helicopter</strong> takes <strong>of</strong>f or lands in sandor onto dust-covered sites, while snow-covered sites produce“whiteouts.” The spin <strong>of</strong> the <strong>helicopter</strong>’s rotors cause clouds <strong>of</strong>dust (or snow) particles to form in the air, obscuring the pilot’sview and, consequently, their situational awareness. When <strong>pilots</strong>do not have the visual cues they require to safely land the <strong>helicopter</strong>,the results can be fatal.In recent years, the U.S. Army has recorded more than 40 cases<strong>of</strong> brownout conditions causing accidents at various trainingfacilities within the U.S. In conjunction with in-theatre operations,this number jumps to 230 cases <strong>of</strong> aircraft damage and/or injury since 1991. Between 2001 and 2007, 80 percent <strong>of</strong>the accidents happened during landing procedures, while only20 percent occurred upon take<strong>of</strong>f. Brownouts are costing theU.S. an estimated $100 million per year.[1]Investigating solutions to this problem has become a highpriority for the military. Until recently, no definitive solution hasbeen developed. However, a new modified <strong>LiDAR</strong> vision <strong>system</strong>is emerging, the Obscurant Penetrating Autosynchronous<strong>LiDAR</strong>, known simply as OPAL. Its early prototypes have beentested in a variety <strong>of</strong> environments and have proven to be robustand powerful.OPAL’s ability to penetrate brownouts and whiteouts is provenmore effective than conventional <strong>LiDAR</strong>. When used in conjunctionwith an infrared camera and a terrain database in a <strong>system</strong>such as Augmented Visionic System (AVS), a powerful syntheticvision <strong>system</strong> is created for <strong>helicopter</strong> <strong>pilots</strong> – enabling a <strong>line</strong><strong>of</strong> sight through brownout and whiteout conditions and ensuringsafe take<strong>of</strong>fs and landings.OPAL versus conventional <strong>LiDAR</strong>Several <strong>system</strong>s exist that are aimed at improving <strong>helicopter</strong>pilot visibility. Although these sensors have some merit, they areinefficient in brownout and whiteout conditions. For example,infrared cameras are effective in poor visibility conditions, suchas fog, but are limited when operating in dust clouds holdingparticles <strong>of</strong> size comparable to the operational wavelength <strong>of</strong>the camera. Additionally, Millimeter-Wave (MMW) radar, flashLADAR, and Range-Gated Cameras are also ineffective withinbrownout and whiteout conditions. For example, because <strong>of</strong> itslonger wavelength, MMW radar penetrates deeply into a brownoutor whiteout but has poor spatial resolution, failing to <strong>clear</strong>ly


define the target image. The flash LADAR and the Range-GatedCamera work together to create a full Field-Of-View (FOV) inone laser shot pulse. Although this <strong>provides</strong> a high resolution, itlacks the ability to penetrate deep inside aerosol clouds becausethe light sources in these devices have to be spread into the FOVfor each shot pulse.Additionally, there are several <strong>system</strong>s on the market aimed atimproving <strong>helicopter</strong> pilot visibility. While various combinations<strong>of</strong> Infrared (IR) cameras with synthetic terrain databases provideeffective solutions to deal with environmental conditions and pollutions,they are not always efficient in brownouts and whiteouts.Akin to driving in a thick fog with high beams, when dust cloudscontain particles <strong>of</strong> a comparable size to that <strong>of</strong> the operationalwavelength <strong>of</strong> the camera, the vision <strong>system</strong> cannot penetrate theaerosol cloud to determine if there are any objects within it.Many active sensors, such as <strong>LiDAR</strong>s, have the ability to penetratefurther into brownouts and whiteouts than passive sensors.Emitting their own energy, <strong>LiDAR</strong>s use a laser shot pulse togather data. Using Time-Of-Flight (TOF) – that is, measuringthe time it takes to travel back and forth from the objects/targetsin its path – the image <strong>of</strong> a flightpath or a landing site is createdby gathering the sensor data and using the dataset to create a 3Dimage or model.Because conventional <strong>LiDAR</strong>s are triggered by the rising edge <strong>of</strong>a return pulse without a separated pulse from the target buried inan aerosol, this <strong>LiDAR</strong> can only report on the range <strong>of</strong> the closestaerosol under the brownout or whiteout situation. The OPALwas, therefore, developed specifically to address this problem.How the <strong>system</strong> worksOPAL <strong>of</strong>fers a higher signal-to-noise ratio than conventional<strong>LiDAR</strong>, resulting in a higher probability <strong>of</strong> detection and/ora greater range <strong>of</strong> depth capability. Its bistatic optical design<strong>provides</strong> robust results, and a proprietary design enables OPALto scan full FOV and acquire 3D data quickly – something traditionalactive sensors using bistatic designs have been incapable<strong>of</strong>. Using OPAL with an IR camera and a terrain database in a<strong>system</strong> like CAE’s AVS, <strong>pilots</strong> gain a powerful synthetic vision<strong>system</strong>.The terrain database must be pre-populated using data collectedby the AVS <strong>system</strong> on an initial scan, or using data purchasedfrom an organization dealing in cartography, or a combination<strong>of</strong> both. With the ability to accurately map a highly detailed andaccurate representation <strong>of</strong> the <strong>helicopter</strong>’s landing zone priorto and during descent into a brownout or whiteout, this <strong>system</strong>presents a stable heads-up/heads-down view <strong>of</strong> the world aroundthe aircraft to the pilot for optimal situational awareness duringhover, landing, and take<strong>of</strong>f situations.This is critical as the terrain database creates an accurate geospecificrepresentation <strong>of</strong> the world, and the information gatheredby OPAL and the IR camera is compared to the information containedwithin the database. Quite simply, the scanned areas areoverlaid onto the synthetic world and changes are quickly andeasily detected.This type <strong>of</strong> synthetic vision <strong>system</strong> <strong>provides</strong> the pilot with completeperception <strong>of</strong> the physical/geographic environment. Whencombined with a helmet mounted vision <strong>system</strong> in a heads-downdisplay, the pilot gains an almost infinite, instantaneous field <strong>of</strong>regard, independent <strong>of</strong> the current geographic conditions.Single Print OnlyTesting for <strong>helicopter</strong> <strong>system</strong>sInitial tests using the Aerosol Research Corridor at DefenceResearch and Development Canada (DRDC) in Valcartier, Quebec,involved comparing OPAL’s performance to that <strong>of</strong> a variety <strong>of</strong>passive sensors including the human eye (visible camera) and anIR camera.The aerosol chamber, which is a long, narrow building, hostedvisible and IR targets. The visible target was a board with blackand white stripes for the visible camera to focus on. The IR targetfor the IR camera was a frame with heated bars. The targetswere placed at the back <strong>of</strong> the chamber, and the doors were shutto disperse the aerosol within the chamber. The visible camera,IR camera, OPAL, and transmissometer were located about100 meters away from the chamber. When the doors opened, thesensors began gathering data within 0.5 seconds and continuedto collect data until the density <strong>of</strong> the aerosol cloud became toothin to gather further measurements.A detection factor – the defined parameter that is the ratio <strong>of</strong>aerosol density at the moment <strong>of</strong> target detection by the OPALor IR camera to the aerosol density at the moment <strong>of</strong> targetdetection by the visible camera – was used to compare OPAL’sperformance to the passive sensors.The OPAL penetrated farther than the eye and the IR camera undersand, dust, and fog aerosol conditions. In fact, the OPAL was ableto see through 50 micrometers <strong>of</strong> sand dust that is 4 times denserthan what the eye penetrates, through 6 micrometers <strong>of</strong> dustwhich is 6.6 times denser than what the eye can penetrate, andthrough fog that is more than 7.6 times denser. This is attributed toOPAL’s timing discrimination, which reduces the scattering effect.Nevertheless, the results demonstrated that the OPAL can be usedto detect obstacles in brownout conditions more effectively thanthe other options tested.To further evaluate the <strong>system</strong> prototype under regular andwhiteout conditions, a test flight was carried out in February2007 at Crash Lake, just north <strong>of</strong> Ottawa, Canada. The AVS<strong>system</strong> was mounted under the National Research Council’sCanada Bell 412 <strong>helicopter</strong>, which then flew to a landing siteconsisting <strong>of</strong> an open field. The center <strong>of</strong> the field featured anarea with bushes and rocks, while the field was surroundedwith trees on the one side and a flat frozen lake on the other.The <strong>system</strong> was used to scan the landing site under different<strong>helicopter</strong> maneuvers: A push-broom scan was used in fly-byor approaching operation, and a raster scan mode was used inhovering operation.To create a whiteout condition, the <strong>helicopter</strong> hovered close tothe ground in order to generate snow clouds with its rotor motion.In this test, the OPAL, developed by Neptec Design Group, andAVS provided outstanding results. The ground and trees behind


Technology: 3D situational awarenesssnow clouds are <strong>clear</strong>ly visible to <strong>pilots</strong> as shown in Figure 1.The bottom part <strong>of</strong> Figure 1 shows the side view <strong>of</strong> the 3D OPALimage under whiteout conditions.The results <strong>of</strong> the pushbroom scan <strong>clear</strong>ly illustrate the 3Dlandscape <strong>of</strong> the fly-by path and the tree and obstacles on thelanding site (Figure 2). This demonstrated the OPAL’s ability tomechanically sustain the vibration during flight while integratedas the active sensor <strong>of</strong> the AVS <strong>system</strong>. It also indicated that theOPAL data and the <strong>helicopter</strong> navigation data can be sucessfullyfused together to generate geo-referenced 3D images.Gaining a <strong>clear</strong>er perspectiveA very specific application, the OPAL/AVS <strong>system</strong> was developedexplicitly to combat the detrimental effects <strong>of</strong> brownout and whiteoutconditions. OPAL’s unique capabilities provide <strong>pilots</strong> a <strong>clear</strong>view <strong>of</strong> what is in their landing zone. The visibility and situationalawareness is critical in terms <strong>of</strong> landing or taking <strong>of</strong>f, therebypreventing accidents that can sometimes prove deadly.Maureen Campbell is the technicalmarketing specialist at Neptec DesignGroup. With more than 10 years <strong>of</strong> experiencein the technology industry, she workswith the research and development team atNeptec. Maureen has been with the companyfor three years. She can be reached atmcampbell@neptec.com.Figure 1Neptec613-599-7602 • www.neptec.comReferences:1. “Owning the Aviation Edge: NVGPID: A Simple Device to TrainCrucial Skills,” Army Aviation, United States Army, 2007-04, p. 22.Figure 2© 2008 OpenSystems Publishing. Not Licensed for distribution. Visit open<strong>system</strong>s-publishing.com/reprints for copyright permissions.

Hooray! Your file is uploaded and ready to be published.

Saved successfully!

Ooh no, something went wrong!