LATEST TRENDS on SYSTEMS (Volume I)

Plenary Lecture 4

Work Directions and New Results in Electronic Travel Aids for Blind and Visually Impaired People

Professor Virgil Tiponut
Electronic and Telecommunication Faculty
POLITEHNICA University of Timisoara
Romania
E-mail: virgil.tiponut@etc.upt.ro

Abstract: There are approximately 45 million blind and visually impaired people worldwide, according to the World Health Report. Vision loss limits these individuals' access to educational opportunities, social events and public transportation, and leads to a higher rate of unemployment.

Many efforts have been invested in recent years, based on sensor technology and signal processing, to develop electronic travel aids (ETAs) capable of improving the mobility of blind users in unknown or dynamically changing environments. In spite of these efforts, the ETAs proposed so far do not meet the requirements of the blind community, and the traditional tools (the white cane and guide dogs) remain the only ones used by the visually impaired to navigate in their working and living environments.

This paper presents research efforts to improve the two main components of an ETA: the Obstacle Detection System (ODS) and the Man-Machine Interface (MMI). For the first time, the ODS under development is bioinspired by the visual system of insects, in particular by the Lobula Giant Movement Detector (LGMD) found in locusts. The LGMD is a large neuron in the optical lobe of the locust that responds mainly to approaching objects.
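The core property the abstract attributes to the LGMD — firing in response to approaching (looming) objects — can be illustrated with a toy model. The sketch below is only a minimal caricature, not the published LGMD model (which includes lateral inhibition and feed-forward suppression): it accumulates inter-frame luminance change in a leaky "membrane potential" and signals when a sigmoid of that potential crosses a threshold, so an expanding image region produces spikes while a static scene does not. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def lgmd_response(frames, gain=20.0, decay=0.7, threshold=0.7):
    """Toy LGMD-like looming detector (illustrative, not the published model).

    Excitation at each step is the mean absolute luminance change between
    consecutive frames; a leaky accumulator plays the role of the neuron's
    membrane potential, and a sigmoid squashes it to (0, 1).  A 'spike'
    (True) is emitted when the squashed potential crosses the threshold.
    """
    potential = 0.0
    spikes = []
    for prev, cur in zip(frames, frames[1:]):
        excitation = gain * np.abs(cur - prev).mean()
        potential = decay * potential + excitation
        activity = 1.0 / (1.0 + np.exp(-potential))  # sigmoid squashing
        spikes.append(activity > threshold)
    return spikes

# Synthetic looming stimulus: a bright square that expands frame by frame,
# as an approaching obstacle would on the retina.
size = 32
frames = []
for r in range(1, 12):
    img = np.zeros((size, size))
    img[16 - r:16 + r, 16 - r:16 + r] = 1.0
    frames.append(img)

spikes = lgmd_response(frames)
```

With these parameters the small early expansions stay below threshold, while the faster late expansion drives sustained spiking, which is the qualitative behavior an ODS would map to an approaching obstacle.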
Starting from the mathematical model of the LGMD known in the literature, an ODS has been developed that can be used by the visually impaired to navigate autonomously while avoiding obstacles. The results obtained so far are very promising, but further improvements are possible. We are now developing preprocessing algorithms for the visual information applied to the input of the LGMD neuron, in order to improve the response of the ODS. In the proposed solution, the position of the detected obstacles is correlated with the attitude parameters of the subject's head. In this way, the visually impaired person detects obstacles much as a normally sighted subject scans for obstacles in front of him.

The man-machine interface developed in the present research exploits the remarkable ability of the human hearing system to identify sound source positions in 3D space. The proposed solution relies on the Acoustic Virtual Reality (AVR) concept, which can be considered a substitute for the lost sight of blind and visually impaired individuals. According to the AVR concept, the presence of obstacles in the surrounding environment and the path to the target are signaled to the subject by bursts of sound whose virtual source positions suggest the positions of the real obstacles and the direction of movement, respectively. The practical implementation of this method encounters some difficulties due to the Head-Related Transfer Functions (HRTFs), which must be known for each individual and for a limited number of points in 3D space.
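The AVR rendering step the abstract describes — placing a virtual sound source at an obstacle's position — amounts to filtering a mono burst with the left- and right-ear HRTFs measured for that direction. A minimal sketch of this, assuming time-domain head-related impulse responses (HRIRs) are available; the HRIRs below are hypothetical single-tap stand-ins, not measured data:

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono burst at the virtual position encoded by the given
    head-related impulse responses (time-domain HRTFs), one per ear.
    Convolving the burst with each ear's response yields a stereo signal
    whose interaural time and level differences cue the source direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

# Hypothetical HRIRs for a source to the listener's right: the right ear
# hears the burst earlier (shorter delay) and louder (larger gain).
fs = 44100
burst = np.sin(2 * np.pi * 1000 * np.arange(int(0.02 * fs)) / fs)  # 20 ms, 1 kHz
hrir_right_ear = np.zeros(64); hrir_right_ear[2] = 1.0   # ~45 us delay, full gain
hrir_left_ear = np.zeros(64); hrir_left_ear[30] = 0.5    # ~680 us delay, -6 dB

stereo = spatialize(burst, hrir_left_ear, hrir_right_ear)
```

Real HRIRs are of course dense filters rather than single taps; the point of the sketch is only that per-direction, per-individual filter pairs are what the system must supply, which is the data the procedure below sets out to obtain.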
These functions can be determined using a rather complex procedure that requires many experimental measurements. The solution proposed in our research avoids these difficulties by generating the HRTF coefficients using an Artificial Neural Network (ANN). The ANN has been trained on a public database, available to the whole scientific community, which contains HRTF coefficients for a limited number of individuals and for a limited number of points in 3D space per individual.

The ODS and the MMI presented above have been implemented on dedicated hardware built around an ARM-based microcontroller system. The obtained results and some conclusions are also presented.

Brief Biography of the Speaker:
Prof. Virgil TIPONUT received the M.Sc. degree in Electrical Engineering/Computer Science in 1968 and the Ph.D. degree in Electronic Engineering and Telecommunications in 1981, both from the POLITEHNICA University of Timisoara, Romania. Since graduation he has been with the POLITEHNICA University of Timisoara, where he is currently a professor at the Electronic and Telecommunication Faculty, responsible for teaching embedded systems, smart transducers and neural networks. His research interests include bioinspired systems, with applications in mobile and rehabilitation robotics, and some closely related areas: smart transducers, neural networks and fuzzy logic, biomedical engineering, embedded

ISSN: 1792-4235 22 ISBN: 978-960-474-199-1
