Le travail à l'écran de visualisation / Work With Display Units (WWDU 1989), Second International Scientific Conference, Montréal

PS 2 66

EYE-CATCHER: A LOW-COST EYE-TRACKER FOR RESEARCH ON HUMAN-COMPUTER INTERACTION

Hans Stolk, Kasper Boon, Mark Smulders, Laurens Jan Ribbens, Educational Engineering Group, Open University, The Netherlands

The aim of the EYE-CATCHER project is to develop a low-cost eye-tracker suitable for research on human-computer interaction in normal working situations.

Commercially available eye-movement equipment has a number of disadvantages which make it less suitable for research on human-computer interaction. Existing equipment is often:
- mentally demanding
- restricted to use in a laboratory situation
- difficult to operate
- expensive (>$50,000)
- not meant specifically for research on human-computer interaction

The aim of this study was to investigate the feasibility of an eye-tracker that does not have these disadvantages and which is suitable for research on the human-computer interaction of subjects working with a VDU in normal situations.

Newly available video processors and infrared-sensitive CCD cameras enable the construction of a low-cost eye-tracker which does not influence the subject and has an accuracy of at least one character on a standard VDU screen. User-centered software design helps to make the equipment easy to operate.

A prototype was developed using an infrared-sensitive CCD video camera and a special optical system. Multiple infrared sources were used to obtain higher accuracy and faster performance in comparison with similar systems using a single infrared source. The video signal is processed by means of a 68000-based video processor, which was developed for this application.

For research on the cognitive modelling of users, the stimulus field, as presented on a VDU screen, can be divided into a number of subfields or, as we call them, "fields of interest". Software has been developed which links these fields of interest to the eye movements, determining the successive fields of interest, the fixation durations and a transition matrix with respect to the fields of interest.

Results will be given on the performance of the system. It will be shown that, by means of a relatively simple center-of-gravity algorithm, real-time processing of eye movements is possible. The usability of the system for human-computer interaction research will be illustrated for simple stimulus fields.

KEYWORDS: human-computer interaction, cognitive models, eye movements.
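The abstract names two computational steps: estimating the gaze position with a simple center-of-gravity calculation on the infrared camera image, and mapping the resulting gaze samples onto predefined fields of interest to obtain the successive fields, the fixation durations and a transition matrix. The abstract gives no implementation details (the original system ran on a dedicated 68000-based video processor), so the Python sketch below is only illustrative: the brightness threshold, the sampling interval, the rectangular field definitions and all function names are assumptions rather than part of the published design, and the calibration from camera coordinates to screen coordinates is omitted.

```python
import numpy as np

def center_of_gravity(frame, threshold=200):
    """Estimate the gaze-related bright spot in an infrared camera frame as the
    centroid (center of gravity) of all pixels above `threshold`.
    `frame` is a 2-D array of grey values; returns (x, y) or None."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def field_of_interest(point, fields):
    """Return the index of the field of interest containing `point`, or None if
    the gaze falls outside every field.
    `fields` is a list of (x0, y0, x1, y1) screen rectangles."""
    x, y = point
    for i, (x0, y0, x1, y1) in enumerate(fields):
        if x0 <= x < x1 and y0 <= y < y1:
            return i
    return None

def analyse_gaze(samples, fields, sample_interval=0.02):
    """Turn a sequence of gaze points (screen coordinates) into a list of
    (field index, dwell duration) pairs and a transition matrix counting how
    often the gaze moved directly from one field of interest to another."""
    n = len(fields)
    transitions = np.zeros((n, n), dtype=int)
    dwells = []
    current, duration = None, 0.0
    for point in samples:
        field = field_of_interest(point, fields)
        if field == current:            # still dwelling in the same field
            duration += sample_interval
            continue
        if current is not None:         # leaving a field: record the dwell
            dwells.append((current, duration))
            if field is not None:       # direct move into another field
                transitions[current, field] += 1
        current, duration = field, sample_interval
    if current is not None:
        dwells.append((current, duration))
    return dwells, transitions

# Hypothetical example: a 640x480 screen split into a menu bar and a text area.
fields = [(0, 0, 640, 40), (0, 40, 640, 480)]
samples = [(100, 20)] * 10 + [(300, 200)] * 25 + [(120, 25)] * 5
dwells, transitions = analyse_gaze(samples, fields)
```

A center-of-gravity estimate needs only one pass over the thresholded pixels, which fits the abstract's claim that a relatively simple algorithm is enough for real-time processing of the eye movements.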
PS 2 67

PROGRESS IN THE VDU AS A PROCESS OF MUTUAL ADAPTATION OF MEN AND INFORMATION SYSTEMS

Prof. Dr. Valéry F. Venda, Head, Department of Learning, Research Institute for Higher Education, USSR

Development of any system is a process of mutual adaptation between the inner components of the system and between the system and its environment. This is the first law in the set of laws of mutual adaptation and transformation of systems proposed by the author at the IXth Congress of the IEA. For the synthesis and development of man-machine systems a more concrete law is needed: the development of a man-computer system is a process of mutual, anticipating, multilevel adaptation between man and computer and between the man-computer system and its environment.

Forecasting adaptation means designing the hardware and software of the man-computer system while modelling its future tasks and environment with a special multicycle imitation and graphic model, the Quadrigram, proposed at the Xth Congress of the IEA and widely used in our research and practical design.

Multilevel adaptation means increasing the efficiency of the man-computer system through sequentially applied levels: total, contingent, functional, individual and individual-operative.

The fundamentals of a methodology of man-machine information interaction were proposed by the author in the plenary paper at the 21st Annual Conference of the US HFS, building on research by T. B. Pew, A. Chapanis, F. Klinx and others.

Multilevel adaptation of man to machine means the unity of professional selection and special training. We taught operators of nuclear power plants the transformation of strategies of mental work and control and the operative creation of principally new strategies, using Transformation learning theory and its methods. The mutual, forecasting, multilevel and transformation approach to progress in VDU workplaces is also promising for prediction, design and training, including the wide-scale sociotechnical systems of Hybrid Intelligence.

KEYWORDS: mutual multilevel adaptation, Transformation learning theory, Hybrid Intelligence.
