Prototype | Location of tactile interface | No. of tactile sensor modules (sensors) | Sensor contingency | Motor contingency | Initial mapping
Tactile Car Seat | Back | 1 (6) | Ultrasound | Sense close targets | topographic
Feel the Force | Waist | 1 (8) | Virtual | Localize target | topographic
Exploring Harmony Space | Back | 3 (48) | Pitch | Harmonic improvisation | topographic

Table 1. A comparison of the three prototype devices that we are planning on building with our tactile interface modules.

Tactile Car Seat

We propose to design a car seat that will provide the driver with a direct perceptual representation of objects in close proximity to the vehicle. We will use an array of 6 vibration motors driven by the activation of 6 ultrasonic sensors positioned on each side of the car at the front, middle and rear. The intensity of vibration will correspond to the proximity of objects to the associated sensor. We predict that with practice this information might improve drivers' situational awareness and increase vehicle safety. This is an important goal: approximately 50,000 reports on road accident injuries or fatalities in the UK in 2005 listed failure to look properly as a contributing factor to the accident, and approximately 1,500 listed failure to see due to the vehicle blind spot [16].

The idea of using tactile representations of information in a car is not a new one. Ho, Tan and Spence [7], for example, describe how vibrotactile warning signals can be used to alert drivers to dangers on the road. However, these systems are designed to be attention grabbing and present information only at critical moments. We predict that presenting tactile information continuously through the car seat might increase the driver's feeling of connection to the car. In certain situations this could be advantageous, for example, enhancing a driver's ability to judge whether the car might fit into a tight parking space.

We will test the prototype interface using two 'quick and dirty' evaluation methods, neither of which will require a person to drive a real car.
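The seat's core sensor-motor contingency can be illustrated with a minimal sketch. This is our own illustration, not part of the planned implementation: the linear distance-to-intensity ramp, the range limits and the function names are all assumptions.

```python
def vibration_intensity(distance_cm, min_range=30.0, max_range=200.0):
    """Map an ultrasonic distance reading to a vibration intensity in [0, 1].

    Objects at or inside min_range give full intensity; objects at or
    beyond max_range give none. The linear ramp and the range limits are
    illustrative assumptions, not values taken from the proposal.
    """
    if distance_cm <= min_range:
        return 1.0
    if distance_cm >= max_range:
        return 0.0
    return (max_range - distance_cm) / (max_range - min_range)

# One intensity per motor: each of the 6 motors is driven by its own sensor.
readings = [25, 90, 210, 115, 150, 40]  # hypothetical distances in cm
intensities = [vibration_intensity(d) for d in readings]
```

On the actual hardware, each intensity would simply scale the PWM duty cycle of the corresponding vibration motor.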
These methods avoid the heavy development overheads associated with designing for a real vehicle or a complex high-end driving simulator. Firstly, we will use the tactile interface to play 'blind man's buff' games in which a blindfolded user seated in the lab has to detect the approach of people; secondly, we will employ a Wizard-of-Oz approach linking movement in an off-the-shelf PC driving simulator with activation of the vibration motor module. While obviously very different from driving a real sensor-augmented vehicle, these evaluation methods will enable us to rapidly gauge the potential of this interface to guide action, and under what conditions it becomes transparent.

Feel the Force

This playful empirical study is inspired by the scene in Star Wars Episode IV: A New Hope where Luke Skywalker is getting his first training in the Force on the Millennium Falcon. He is wearing a helmet with an opaque visor that prevents him from seeing a floating robot that moves around him and occasionally zaps him with an electric shock. He has to 'feel the Force' in order to sense the position of the robot and block its zap with his light sabre.

Each user will wear a cummerbund containing 8 equally spaced vibration motors (45 degree separation). The user's 'light sabre' will consist of a Wii nunchuk connected to an Arduino microcontroller. Users will start in a 'registration' position and then the system will track their movements using the 3-axis accelerometer in the nunchuk. The aim of the game is to move the nunchuk so that it blocks zaps from a virtual robot. Its movement will be indicated by changes in activation across the array of vibration motors. A zap occurs when the robot gets closer, indicated by an increase in vibration intensity.
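To make the topographic mapping concrete, here is a minimal sketch (again our own illustration, not the planned implementation) of how a target direction could be mapped onto the 8-motor cummerbund, with intensity rising as the virtual robot approaches; the cosine falloff and the range constant are assumptions.

```python
import math

NUM_MOTORS = 8  # equally spaced around the waist, 45 degrees apart

def motor_activations(target_angle_deg, distance, max_distance=3.0):
    """Return an intensity in [0, 1] for each of the 8 cummerbund motors.

    The motor nearest the target direction is driven hardest, with a
    cosine falloff to its neighbours; overall intensity grows as the
    virtual robot approaches. Falloff shape and range are illustrative.
    """
    proximity = max(0.0, min(1.0, 1.0 - distance / max_distance))
    activations = []
    for i in range(NUM_MOTORS):
        motor_angle = i * 360.0 / NUM_MOTORS
        diff = math.radians(target_angle_deg - motor_angle)
        # Cosine tuning: 1 when aligned, 0 at 90 degrees or more away.
        alignment = max(0.0, math.cos(diff))
        activations.append(alignment * proximity)
    return activations

# A robot directly behind the user (180 degrees) at half range
# drives motor 4 hardest, at half intensity.
acts = motor_activations(180, 1.5)
```

Alternative, non-topographic mappings of the kind the study will compare would simply permute which motor index corresponds to which direction.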
If a user responds to this increase by moving the nunchuk to the correct position then they will get force feedback from a vibration motor attached to the nunchuk, indicating that they have blocked the zap; if they move to the wrong position then a number of vibration motors in the cummerbund will vibrate, indicating that they have been 'hit'.

We will measure how long it takes users to become proficient in blocking zaps. If combined with interviews, this might enable us to determine whether transparency, if achieved, is signalled by performance level. We can map any of the locations in virtual zap space to the vibration motors and explore how different mappings affect users' performance. We predict that the topographic representation, where adjacent vibration motors map to adjacent locations in space, will facilitate the best performance.

Exploring Harmony Space

We plan to develop a system that uses Holland's Harmony Space system [8, 9] to provide a tactile spatial representation of harmonic structure to musicians learning to improvise. Beginning improvisers typically get stuck on 'noodling' around individual chords from moment to moment and are unable to interact meaningfully with the strategic, longer-term harmonic elements, for example, chord progressions and modulations, which are typically essential to higher-level structure in western tonal music, including jazz and much popular music.

Harmony Space draws on cognitive theories of harmonic perception, providing consistent uniform spatial metaphors for virtually all harmonic phenomena, which can be translated into spatial phenomena such as trajectories, whose length, direction and target all encode important information. Thus, Harmony Space enables numerous harmonic relationships to be re-represented in a way that may be more cognitively tractable.

We will use the Harmony Space representation to provide musicians with a tactile representation of the harmonic relationships of the music they are currently playing. This will be achieved by having the musicians wear a vest with a 6x8 array of tactile actuators, where each actuator will represent a note that the musician is playing. The notes will be identified directly in the case of electronic instruments, or sensed using microphones and pitch trackers in the case of acoustic (monophonic) instruments. We predict that representing pitch movement in this way will facilitate the development of a spatial understanding of musical relationships, which will transfer to improved performance in a wide variety of musical tasks, including improvisation. We will investigate whether performance is linked to the interface becoming transparent.

CONCLUSION

The E-Sense project is taking an interdisciplinary approach to investigating the extended mind, in particular the nature of sensory augmentation.
We will use a rapid prototyping approach to build 3 novel tactile interfaces that mediate different sensory modalities (ultrasound, pitch and 'virtual' location). As well as testing the practical utility of these systems, we hope to gain more insight into the conditions under which technologies become transparent, as well as gather more evidence for the theoretical viability of the sensorimotor contingency model.

ACKNOWLEDGEMENTS

This research is supported by the Arts and Humanities Research Council, grant number AH/F011881/1.

REFERENCES

1. Bach-y-Rita, P. Brain Mechanisms in Sensory Substitution. Academic Press, NY, 1972.
2. Arduino electronics prototyping platform. http://www.arduino.cc/ Retrieved June 2008.
3. Bird, J., d'Inverno, M. and Prophet, J. Net Work: An Interactive Artwork Designed Using an Interdisciplinary Performative Approach. Digital Creativity, 18 (1), (2007), 11–23.
4. Bird, J., Stokes, D., Husbands, P., Brown, P. and Bigge, B. Towards Autonomous Artworks. Leonardo Electronic Almanac, (forthcoming).
5. Clark, A. Natural-Born Cyborgs: Minds, Technologies and the Future of Human Intelligence. Oxford University Press, NY, 2003.
6. Heidegger, M. Being and Time. Harper and Row, NY, 1962.
7. Ho, C., Tan, H. Z. and Spence, C. Using spatial vibrotactile cues to direct visual attention in driving scenes. Transportation Research Part F: Traffic Psychology and Behaviour, 8, (2005), 397–412.
8. Holland, S. Artificial Intelligence, Education and Music. PhD thesis, IET, The Open University, Milton Keynes. Published as CITE report No. 88, 1989.
9. Holland, S. Learning about harmony with Harmony Space: An overview. In M. Smith and G. Wiggins (Eds.) Music Education: An Artificial Intelligence Approach. Springer Verlag, London, 1994.
10. Kohler, I. The formation and transformation of the perceptual world. Psychological Issues, 3, (1964), 1–173.
11. LilyPad sewable electronic components. http://www.cs.colorado.edu/~buechley/LilyPad/index.html Retrieved June 2008.
12. Maravita, A. and Iriki, A.
Tools for the Body (Schema). Trends in Cognitive Sciences, 8, (2004), 79–86.
13. O'Regan, J. K. and Noë, A. A Sensorimotor Account of Vision and Visual Consciousness. Behavioral and Brain Sciences, 24 (5), (2001), 939–973.
14. Processing programming language and environment. http://www.processing.org/ Retrieved June 2008.
15. Ramachandran, V. S. and Blakeslee, S. Phantoms in the Brain: Probing the Mysteries of the Human Mind. Fourth Estate, London, 1998.
16. Robinson, D. and Campbell, R. Contributory Factors to Road Accidents. Transport Statistics: Road Safety, Department for Transport, 2005. http://www.dft.gov.uk/ Retrieved June 2008.
17. Rogers, Y. and Muller, H. A Framework for Designing Sensor-Based Interactions to Promote Exploration and Reflection. International Journal of Human-Computer Studies, 64 (1), (2005), 1–15.
18. Rogers, Y., Scaife, M. and Rizzo, A. Interdisciplinarity: an Emergent or Engineered Process? In S. Derry, C. D. Schunn and M. A. Gernsbacher (Eds.) Interdisciplinary Collaboration: An Emerging Cognitive Science. LEA, (2005), 265–286.
19. Stratton, G. M. Some preliminary experiments on vision without inversion of the retinal image. Psychological Review, 3, (1896), 611–617.