Presenting the information in the form of physical stimuli, rather than intellectual (textual and image based) information, allows the Fear Tuners wearer to focus the center of his or her attention on other things. The wearer can completely process Fear Tuners' signals in the background of awareness. This form of ambient information presentation engages the senses and thus results in a subtle, yet intense experience that does not disrupt the wearer's daily routine [10].

In the process of exploring suitable sensations, I investigated different actuators, such as solenoids, vibration motors, Peltier pumps and electrical deep tissue stimulation, aiming to create cold shivers (Figure 4), goose bumps, raised neck hair and hot stings. I also looked into possibilities of exploiting the phenomenon of somatosensory illusions [9].

I identified five key scenarios, Disasters, Financial, Health, Personal and Technology, in which Fear Tuners would act as an 'artificial sixth sense' in the form of a device.

CONCLUSIONS
At present, Fear Tuners exist as a series of technical experiments, form prototypes, a video scenario and a booklet. They were presented as part of my thesis at the Royal College of Art graduation show. I hope to take the project to the next level, in which the preceding research and experimentation in form and function would be combined to create a fully functional prototype. For this next step, I am looking for collaboration partners from backgrounds other than design.

ACKNOWLEDGMENTS
I thank Fiona Raby, Tony Dunne and James Auger, who guided and helped to develop the project at the Royal College of Art, and Carson Reynolds for his valuable advice and inspiration.

REFERENCES
1. Aldersey-Williams, H., Briscoe, S., Panicology, Viking/Penguin, London, UK, pp. XIV-XVIII, 2008.
2. Cannon, W. B., The Wisdom of the Body, Norton, New York, NY, 1932.
3. Douglas, M., Wildavsky, A., Risk and Culture, University of California Press, Berkeley, CA, USA, 1983.
4. Ekman, P., Rolls, E. T., Perrett, D. I. and Ellis, H. D., Facial Expressions of Emotion: An Old Controversy and New Findings [and Discussion], Philosophical Transactions: Biological Sciences, Vol. 335, No. 1273, Processing the Facial Image, pp. 63-69, 1992.
5. Gardner, D., Risk, Virgin Books, London, UK, 2008.
6. Gaver, W., Dunne, T. and Pacenti, E., Cultural Probes, Interactions, pp. 24-25, 1999.
7. Jansen, A. S., Nguyen, X. V., Karpitskiy, V., Mettenleiter, T. C., and Loewy, A. D., Central command neurons of the sympathetic nervous system: basis of the fight-or-flight response. Science, 270(5236): 644-646, 1995.
8. Kusahara, M., Device Art: A New Approach in Understanding Japanese Contemporary Media Art. In MediaArtHistories, ed. Oliver Grau, MIT Press, Boston, MA, USA, p. 288, 2007.
9. Sherrick, C. E. and Rogers, R., Apparent haptic movement, Perception & Psychophysics, Vol. 1, pp. 175-180, 1966.
10. Wisneski, C., Ishii, H., Dahley, A., Gorbet, M., Brave, S., Ullmer, B., and Yarin, P., Ambient displays: Turning architectural space into an interface between people and digital information. Lecture Notes in Computer Science, Vol. 1370, pp. 22-32, 1998.
Gesture recognition as ubiquitous input for mobile phones

Gerrit Niezen, Gerhard P. Hancke
University of Pretoria
Lynnwood Road, Pretoria, 0002, South Africa
{gniezen, g.hancke}@ieee.org

ABSTRACT
A ubiquitous input mechanism utilizing gesture recognition techniques on a mobile phone is presented. Possible applications using readily available hardware are suggested, and the effects of a mobile gaming system on perception are discussed.

Author Keywords
ubiquitous computing, accelerometers, gesture recognition, optimization, human-computer interfaces

ACM Classification Keywords
B.4.2 Input/Output Devices, H5.m. Information interfaces and presentation

INTRODUCTION
Mobile phones are the most pervasive wearable computers currently available and have the capabilities to alter and manipulate our perceptions. They contain various sensors, such as accelerometers and microphones, as well as actuators in the form of vibro-tactile feedback. Visual feedback may be provided through mobile screens or video eyewear.

Dynamic input systems in the form of gesture recognition are proving popular with users, with Nintendo's Wii being the most prominent example of this new form of interaction, which allows users to become more engaged in video games [1]. The video game experience is now affected not only by timing and pressing buttons, but also by body movement. To ensure a fast adoption rate of gesture recognition as a ubiquitous input mechanism, technologies already available in mobile phones should be utilized.
Features like accelerometer sensing and vibro-tactile feedback are readily available in high-end mobile phones, and this should filter through to most mobile phones in the future.

Copyright is held by the author/owner(s).
UbiComp '08 Workshop W1 – Devices that Alter Perception (DAP 2008), September 21st, 2008.
This position paper is not an official publication of UbiComp '08.

Hand gestures are a powerful human-to-human communication modality [2], and the expressiveness of hand gestures also allows for the altering of perceptions in human-computer interaction. Gesture recognition allows users to perceive their bodies as an input mechanism, without having to rely on the limited input capabilities of current mobile devices. Possible applications of gesture recognition as ubiquitous input on a mobile phone include interacting with large public displays or TVs (without requiring a separate workstation) as well as personal gaming with LCD video glasses.

The ability to recognize gestures on a mobile device allows for new ways of remote social interaction between people. A multiplayer mobile game utilizing gestures would enable players to physically interact with one another without being in the same location. Gesture recognition may be used as a mobile exertion interface [3], a type of interface that deliberately requires intensive physical effort. Exertion interfaces improve social interaction, similar to games and sports that facilitate social interaction through physical exercise. This may change the way people perceive mobile gaming, as it now improves social bonding and may improve overall well-being and quality of life.

Visual, auditory and haptic information should be combined in order to alter the user's perceptions. By utilizing video glasses as visual feedback, earphones as auditory feedback and the mobile phone's vibration mechanism as haptic feedback, a pervasive mobile system can be created to provide a ubiquitous personal gaming experience. Gesture recognition is considered a natural way to interact with such a system.

Gesture recognition algorithms have traditionally only been implemented in cases where ample system resources are available, i.e. on desktop computers with fast processors and large amounts of memory. In the cases where gesture recognition has been implemented on a resource-constrained device, only the simplest algorithms were considered, implemented to recognize only a small set of gestures; for example, in [5] only three different gestures were recognized.

We have developed an accelerometer-based gesture recognition technique that can be implemented on a mobile phone. The gesture recognition algorithm was optimized such that it only requires a small amount of the phone's resources, in order to be used as a user interface to a larger piece of software, or a video game, that will require the majority of the system resources. Various gesture recognition algorithms currently in use were evaluated, after which the most suitable algorithm was optimized in order to implement it on a mobile phone [6]. Gesture recognition techniques studied include